>It sucks donkey balls
>Average 96.5 FPS at 1440p.
cool, im still using an old amd card because why the fuck would I buy a new graphics card for shit modern games
>have a 3070 and a 4k monitor
>want a 4060Ti 16GB for AI shit
>but it'll most likely perform like shit in games
The fuck do I do, bros?
Get a 3090
That'd kill my 750W PSU, wouldn't it
Currently running one with a 750w and it seems to be fine overall, probably the bare minimum
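Rough math on the 750W question, assuming a ~350 W stock 3090, a ~150 W CPU under load, and ~75 W for the rest of the system (nominal figures, not measurements):

```python
def psu_headroom(psu_watts, gpu_watts=350, cpu_watts=150, rest_watts=75):
    """Remaining wattage after the listed sustained loads."""
    return psu_watts - (gpu_watts + cpu_watts + rest_watts)

# ~175 W of slack on a 750 W unit; transient spikes eat into this,
# which is why 750 W is about the floor for a 3090.
print(psu_headroom(750))
```

Transient spikes on Ampere cards can briefly exceed the rated board power, so that slack is thinner than it looks.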
4060ti is as powerful as a 3070
The 4060ti is barely beating the 3060ti, it's most certainly not as powerful as the 3070
the 4060 Ti is at BEST a side grade to a 3070. You're going to have to go up to a 4080 or more for 16+GB. Or pray nvidia releases more models with "extra" VRAM.
I guess the 4080 doesn't look as bad when you don't have tech reviewers breathing down your neck, right? I don't know why they got so pissed off at it, it's a great value for AI + vidya at a decent (post-GPU-crisis) price. Some anons were saying "dude just get a 3090 lmao" but it's not the same price, doesn't have DLSS3, and idk if it has AV1
AV1 is only on 4xxx cards if I'm not mistaken
I tried it with OBS and encoding my gameplay at 25Mbps AV1 gives some really fucking good quality footage
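For a sense of what 25 Mbps AV1 costs in disk space, a back-of-envelope sketch (assumes constant bitrate; real encodes vary with scene complexity):

```python
def recording_size_gb(bitrate_mbps, minutes):
    """Approximate file size in decimal gigabytes at a constant bitrate."""
    megabits = bitrate_mbps * minutes * 60
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

# An hour of 25 Mbps AV1 gameplay works out to roughly 11 GB on disk.
print(round(recording_size_gb(25, 60), 2))
```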
Too bad premiere doesn't support AV1
Adobe is a blight, soon to go the way of Autodesk.
4060ti is a flat out downgrade, it's crazy just how bad the product stack is.
No point getting anything other than a 4090 or 4080 depending on regional pricing.
The only thing that soothes my wallet buying a 4080 is that I literally didn't have a PC before buying it and that it'll last me until it dies for 1440p
Tf you need AI shit for? If you really want to do AI shit, you'd be willing to spend more anyway.
>Tf you need AI shit for?
Artist. SD cuts a whole slice out of my workload for generic tasks.
>you'd be willing to spend more anyway
I am. Most likely I'll be getting a 3090 like someone else suggested. I was hoping for the 4060/16 to turn out OK because of power consumption.
You're not an artist if you use AI.
Alright, Give me your best shot.
Tell me why.
And remember to remain factual.
Talent doesn't care about your feelings.
1) a lot of art websites started to ban AI "art"
2) people are training AI "art" on CP, do you really want to involve yourself into this disgusting shit?
>fartist seething + bullshit
You're only in danger if you made "simple" stuff prior to it anyway, storytards are safer than ever.
It's a well known fact people started training AI with CP, but yeh keep coping about that I guess.
Your proofless boogeyman won't work here you know
>people are training AI "art" on CP, do you really want to involve yourself into this disgusting shit?
Yeah that /b/ thread feels like it's a honey pot, I feel like there's gonna be a reckoning soon with this shit because there's no way to tell what people are training on
I'm glad pixiv started banning that shit, I was pretty neutral on AI art until the gays started fucking spamming it constantly so I have to dig through 10 AI pictures for every actual picture.
Hack mad that AI is more talented than he is hahahaha
>still strawmanning with his proofless Boogeyman
You have zero talent as an artist and no one will ever buy your shit. You will never be successful.
Enjoy your jail sentence, child molester.
Explain your use case of how text-to-image AI generation shaves time off your workflow. Almost every excuse will result in your "art" having no value to even glance at. Obsession over VRAM in GPUs for the sake of the latest frothing AI meme is just as bad as cryptobros buying 20 GPUs to mine ethereum.
You have terrible bait. Commission some better worms from actual artists.
This. Started second guessing every other image just because of how deeply rooted they are in their lifted references. It's all dogshit, even if it will only improve from here.
Meme resolution that makes you waste too much on GPU/CPU to use it
It's an ultrawide, actually. Comes in handy for displaying references and such.
Your only upgrade path is the 4090
Used 3090s are going for 700-800 all day.
Yeah, Palit ones maybe.
So it's better to buy the 3060 line if I just want to play at 1080p?
The RE4 Remake results are probably the funniest. Nothing changes at all between the 3060 Ti and 4060 Ti. Modern Warfare 2 was the only game that had noticeably large gains in FPS. Fuck off Nvidia.
Yeah. The 3060 or the 3060 Ti are good 1080p cards. Unless you're putting literally every option on ultra in every game ever, they should be fine for most games.
Ultra settings are a meme at this point.
Only for third world monkeys
>Ultra settings are a meme at this point.
They always were. Ultra is literally just games without optimization for ANY hardware. If you can run a game on Ultra settings at your desired framerate and resolution, then cool, you got a strong card, but that's all it means.
is it good for a modded skyrim and Fo4 gameplay?
Should be good for modded Bethesda games. It just depends on how fucking stacked you want the game to be with the mods. There's probably a breaking point where once you add like 6 million things you'll only be able to run the game with a GPU with lots of VRAM.
Maybe someone else can chime in on that.
>so it's better to buy obsolete hardware
no you dumbass, you're too late, wait for the next gen and hope it isn't a total scam like this one
I'm totally illiterate on PC technology, always been a consolefag but I'm considering looking at switching to PC instead of PS5 this gen. Why are the 3k and 4k cards so bad? And how long will it take for the 5k series or whatever to release. And will it cost ridiculous money like these cards too? Is it even worth building a PC
>Why are the 3k and 4k cards so bad?
Too expensive for what you're getting.
Stop being a braindead sheep consumer and buy AMD with far more performance for the same price or less.
While it's an old card, if it's cheap then it's good.
The 4060 and 4060Ti make no sense, the only good option would be the RX 6700 but it is also old and the RX 7700 has yet to come out.
The one with functional drivers.
you failed the iq test sir
amd has better drivers than nvidia
Why would you lie on the internet bro
amd has way better drivers than nvidia on gnu/linux, nvidia has dropped kepler support while tahiti still works
It's not 2015 anymore, AMD gpu's have no driver issues.
I use a 2060 for 4k dafuq do they mean it's a 1080p card
My 6800XT is kicking its ass at stock settings lul
>My 3060 has 12GB
Why does this thing exist?
To go into Walmart prebuilts.
the 4060 is weaker than the 3060? i don't understand these gpu naming schemes
Way weaker, the main problem is that nvidia is cheaping out on VRAM: fewer chips = less speed, and the amount of VRAM is also lame for the price.
Nvidia is made up of garden gnomes and they want you to think
>big number must be better!
*laughs in 3060 12gb 192 bit bus*
The shift in priorities to DLSS and Ray Tracing has begun. Raw rendering power is a thing of the past.
In theory, DLSS is a good thing because it will greatly future proof hardware.
>proprietary technology future proofing anything
Tell me you're a goddamn retard without telling me you are
FSR at least was made in response to dlss, so at least garden gnomevidia caused something good to happen.
DUDE JUST SLAP DLSS ON IT
BUY 5XXX SERIES THIS YEAR GOY
YOUR 4090 IS ALREADY OUTDATED
The series 4XXX is shit, the 3XXX is shit too
I'm still using a 1080 Ti at 1440p and I have no reason to "upgrade"
very poor performance
Oh fuck off, I'm an amd fag but amd cards do way better in this game, this is not a fair comparison.
A 7900xtx destroys a 4090 in this game
T. 7900xtx owner
>less performance than previous gen's same tier GPU
it's not the same tier
RTX 3060 Ti was using GA104, RTX 3060 used GA106
RTX 4060 Ti is using AD106, RTX 4060 uses AD107
4060 Ti is 3060 successor, not 3060 Ti successor
All of this doesn't matter if it costs like a 3060Ti instead of a 3060. As of right now its direct competition is 3060Ti.
yes and because it's misnamed and overpriced it's now getting the nickname of "waste of sand"
The point isn't AMD vs Nvidia, it's that the 4060 Ti barely outperforms the 3060 Ti.
A 7900xtx beats the 4090 in a lot of games.
It beats it in COD, for example.
These companies are actually closer to each other in tech than you’d think.
Now do it with DLSS on
>now do it at a lower resolution
enjoy your pixelated fake frames
DUDE JUST RENDER EVERYTHING IN 720P LMAO
fuck off gay pajeet its not like you can afford those cards anyway.
>now degrade the image quality
>using DLSS on 1080p
it just werks
>720p on huge screen
>he doesn't have a gaming laptop
lmaoing @ your life
>gayming heater laptop
Hope you guys are ready for the influx of shills telling you to enable DLSS now because this is the future you are getting. Jesus.
The worst part about dlss is that it fails to address the purpose of high fps. The point of high fps is for image consistency and being able to act during those frames in multiplayer games. AI generated frames are not accurate to what is happening in the game, this means you may be shooting ghost limbs in an fps or miscalculating a jump in a platformer.
I know VRAM is a meme everyone is shilling now for some reason.
I literally just saw a video where someone says he would choose the 6700XT over 3060 Ti even when disregarding price differences, because it has more VRAM and that makes it better for higher resolutions.
All his test cases however show that the 3060 Ti performs on average the same at 1440p, if not better, at lower temps.
It's not a meme. People just don't understand that just because it's a bottleneck for some cards doesn't mean a card with more of it is going to blow them out of the water.
The issue is that VRAM doesn't matter...until you run out of it.
So if an 8 GB GPU is running out of memory on max settings in 2022, by 2025 you'll be hitting those memory limits on medium.
what changed? 10-20 years ago nobody gave a shit about vram. 1GB was enough for almost 10 years (between 2005 and 2015 VRAM "only" doubled)
devs are lazy and dont optimize the textures, so all the textures are 4k ones on max settings
>10-20 years ago nobody gave a shit about vram.
That's definitely not true, I remember my old family PC with some shitty Intel chipset shitting itself in Battlefield 1942 because it didn't have enough memory.
>shitty intel chipset
so onboard graphics. They used to be even worse back then, VRAM was probably not the only issue
your textures will look like shit when there's not enough vram to handle it
Nah, sorry anon, as a true PC gamer chad I don't care about texture quality, it's fine if this new $400 GPU has worse textures in current games than a fucking console.
I never heard about Gamers Nexus till like a year ago but holy shit they've been putting in the work to prove dumbfuckery and scams
>May need to reduce settings in 1080p games
All I needed to know.
Don't care, still buying AMD.
>Don't care, still buying AMD.
Might as well for everyone unless you care about ray tracing, DLSS, or building a white-themed PC.
>o11 dynamic xl
Wow that was hard.
>still buying AMD
You lose either way, since AMD doesn't care about using Nvidia's blunders to gain market share; they're perfectly comfortable playing second fiddle.
I hope you bought Arc.
nvidia don't have long left. Intel are getting better and better and the Arc series were standing with the RTX 3000 cards for half the cost.
AMD are perfectly capable of destroying Nvidia like they did with Intel but just choose not to. Heck, even Intel cards aren't that bad for how cheap they are.
Last time AMD DESTROYED Nvidia, FX was a thing and ATI (now part of AMD) unleashed real-time tessellation that worked wonders even for Morrowind and other games.
the 2000s were a decade of AMD and ATI curbstomping.
>EVGA is proven right with every release
WE DIDN'T LISTEN
>in some instances it is worse than 3060Ti
The garden gnomery is out of this world.
As much as I want to buy AMD, CUDA is simply too powerful to let go.
What do you use it for?
The 4060ti has fewer CUDA cores than the 3060ti.
Besides, if you actually cared about CUDA you would just get a 2080ti which is like the same price with vastly more CUDA.
But you don’t because CUDA is only brought up by larpers.
Nvidia are scammers
>RTX 3070ti is a 5% performance increase and runs much hotter
Going from 8 to 16GB RAM for $100 is a scam too.
It should be around $31, and that's using low-scale prices (Micron chips in 1,000-unit quantities).
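Sanity-checking that figure, assuming roughly $3.88 per GB of GDDR6 at small volume as the post implies (actual contract pricing isn't public):

```python
def vram_chip_cost(extra_gb, price_per_gb=3.875):
    """Chip cost for extra VRAM at an assumed spot price per GB."""
    return extra_gb * price_per_gb

# 8 extra GB at ~$3.88/GB is about $31 in chips, vs the $100 upcharge.
print(round(vram_chip_cost(8)))
```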
I remember buying a 780 for 350 usd for my first high end PC. Now a 4080 costs 1600 usd and it's only getting worse
I got my 4080 for $900 on amazon warehouse. It was basically brand new.
You know they're not bashing themselves for going for more reasonable vram config, or adjusting the price or amount of rt cores.
They'll have some exec upset they made the 3000 series GPUs too powerful, so they make this gen's shitpiles look worse.
Well, this is effectively the successor to the RTX 3060. It's even using the 106 chip.
It should have been called 4060, and what they're calling 4060 should have been the 4050 Ti.
Now they deal with the consequences of misnaming and charging too much for a product
>Now they deal with the consequences of misnaming and charging too much for a product
There are none. You keep buying NVIDIA.
people only buy nvidia because AMD are not up to scratch.
Look at what Ryzen did to Intel. If the performance to cost ratio is there, people will flock to it.
So there are no consequences, because you believe you can only buy NVIDIA.
it takes time. AMD's day will come.
That time could be now if AMD wasn't committed to being only SLIGHTLY less gnomish than nvidia. They should be significantly undercutting nvidia.
amd has been "up to scratch" this entire time, shills just worked extra hard to scream about MUH DRIVERS even though I've used an AMD card 6 years ago with no issues and have driver issues with my current Nvidia card, and MUH RAYTRACING even though only the most powerful and most overpriced cards can actually drive it properly without crippling the performance just because AMD "can't do it".
They don't significantly undercut Nvidia. That's the problem. People don't want to pay $900 for a flagship GPU, they want to pay $400-$450.
AMD literally can't afford cutting their prices in half. They're already undercutting the 4080 by 200 dollars with the 7900XTX, they've been undercutting the GTX 1060 for YEARS with the RX 480/580 and everyone was still buying the 1060 instead. I've personally seen people buy the gimped 1060 3GB when the 580 was the same price because "uhhh i heard it crashes", and when the GTX 480 was LITERALLY THE LAUGHING STOCK OF THE INDUSTRY the retards still bought Fermi in droves instead of the HD5000 series.
I wish someone made an update to this video, but even at 4 years old it's still so fucking true.
I had that one and it was truly a gem for its range and price
>You keep buying NVIDIA
I haven't bought an Nvidia card in 6 years, still going strong on 1080ti, the perfect card.
The current problem is that some new bullshit keeps releasing that allows Nvidia to stay afloat outside the consumer market. 3 years ago it was crypto, now it's AI, and they're earning so fucking much on corpo AI that I wonder how many fabrication lines have shifted to the H series considering consumer cards don't sell, not that it fucking matters.
Not me, my last GPU was a 1070 and I got a 6950 XT earlier this year for $640.
not my problem
So the 128 bit bus really hurts it at anything beyond 1080p?
7600 VILL save us
The 7600 will be the same flavor of e-waste just under a different brand name.
>299 when 10% weaker 6600 is 200 and a lot stronger 6700xt is 350
nope, amd is a shitshow this gen just like nvidia
>299 for 7600
you made it up
>200 for 6600
you made it up
you made that one up too
>you made it up
>you made it up
just kys shill
still on my 960 GTX, though I've been considering upgrading for years now.
since the 4060 Ti is a confirmed flop, should I look forward to the next series or get a 3060 Ti? I'm 1080p, might go for 1440p or just higher refresh rate. I'm not in a hurry either
go for AMD at this point. 5000 series is just going to be another overpriced line of cards. I wouldn't go paying full price for a RTX 3060ti if you want one though, the card isn't that great either with 8gb vram.
yeah frankly that's another option I had in mind
alright, I'll look into AMD. thanks
I was under the impression it's more of a 1080p range card. Also how far above 60fps do people need? I mean I enjoy the smoothness of high refresh rate from time to time but it's not something I'm absolutely frothing at the mouth to lust over.
Yes, I expect it to perform decently at 1440p.
>tfw all i play are games no older than 1999
I don't care at all how it performs at 1440p, I care how it performs at 640x480. High resolution holds graphics back, and there are already graphics too demanding for real-time rendering; there always were. Path tracing in particular is interesting because it is pretty. UE can do a shit fake job of it via Lumen. Every game should do path tracing, and use remote processing / a GPU & CPU shared processing network using the internet and otherwise dormant hardware. Current cheaper cards are shittier but still use the densest chips available, meaning they draw less power and probably operate at lower temps.
It is a shame idiots don't want fast cards, they want cards that can do a lot of pixels at once.
This shit is a scam
why is the 4090 never on these lists
Because it blows the fuck away anything just like the 7900 XTX.
It's gotten almost linear in performance.
400$ is the baseline.
You pay 800$ you get double so a 7900 XT
You pay 50% more for a 7900 XTX you get about 50% more.
You pay 100% more for a 4090 and you get about 80% more.
So there are like no value propositions at all. You get the best if you want the best.
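The "almost linear" claim above, worked through with the thread's own illustrative prices and performance multiples (not benchmark data):

```python
# (price in $, relative performance with the $400 card as 1.0)
tiers = {
    "$400 baseline": (400, 1.0),
    "7900 XT":       (800, 2.0),   # double the price, double the perf
    "7900 XTX":      (1200, 3.0),  # +50% price, ~+50% perf over the XT
    "4090":          (1600, 3.6),  # +100% price, ~+80% perf over the XT
}

def value(price, perf):
    """Performance per dollar, normalized so the baseline scores 1.0."""
    return perf / price * 400

for name, (price, perf) in tiers.items():
    print(name, round(value(price, perf), 2))
```

Every tier lands at or near 1.0 value, with the 4090 dipping to ~0.9, which is exactly what "no value propositions, you just buy the best you can afford" means.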
Man how things have changed since the 1xxx generation jesus christ.
970 and 1070 were like 80-85% of the performance of their 80 counterparts for 50% less.
look at that RTX 3070 - 4070 comparison. What the actual fuck? The amount of garden gnomery is unbelievable.
>weaker than my 3060ti
What in goddamn
If you look at the core counts relative to the top GPU:
4080 = 3070
4070 = 3060
4060 = 3050
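Those equivalences roughly line up with published CUDA core counts; a sketch using spec-sheet numbers (treat the comparison as approximate, since core counts alone don't capture clocks or architecture):

```python
# Published CUDA core counts per SKU (spec-sheet numbers)
ada    = {"4090": 16384, "4080": 9728, "4070": 5888, "4060": 3072}
ampere = {"3090 Ti": 10752, "3070": 5888, "3060": 3584, "3050": 2560}

def share_of_flagship(cores, flagship_cores):
    """Fraction of the flagship's core count, rounded to 2 places."""
    return round(cores / flagship_cores, 2)

# The 4080 sits roughly where the 3070 did relative to the flagship
# (~0.59 vs ~0.55), and the 4070 where the 3060 did (~0.36 vs ~0.33).
print(share_of_flagship(ada["4080"], ada["4090"]),
      share_of_flagship(ampere["3070"], ampere["3090 Ti"]))
```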
Keep in mind they updated this game so that at ultra the textures are constantly cycling to potato mode if your card isn’t powerful enough to load the textures.
So a lot of the fps numbers on that list probably aren’t for “real” ultra.
T-the left looks b-better anyways.
L-look at how much b-better it runs, b-based Nvidia.
>128 bit bus memory
Yeah literally everyone could have told you that
Not my problem, only idiots still give money to nFailia
doesn't matter the retard masses will still buy it.
At least it can interpolate frames, eh?
Any 4090 chads?
pick only 1
I can't imagine what sort of retard would even buy an AMD gpu. I had an RX480 years ago and I'll never give them money again
The 480 wasn't even that bad, besides you're a sub-moron if you shill for nvidia now.
They might have been bad in the past but the 6000 series cards are easily the best things to buy right now and I've had zero driver issues in the past year and a half of running a 6800xt.
the only choice
gpu market is entirely fucked
I have the same rig. Got an oled monitor to top it off
been thinking about a monitor upgrade but haven't decided whether it's worth it
currently running 1440p 165hz, not sure whether 240 hz will be enough of an upgrade, but oled sure would be
you're a fucking retard, not a chad. should've bought a beat-up miata and gone lemon racing
How are hardware reviews so based? They're not afraid to roast the companies for releasing bad products, unlike game reviewers
dont worry, just buy the RTX 4065ti. Gobble that shit up, goy.
Because performance of a gpu is vastly different than an opinion on a video game.
hardware companies haven't realized that if they put minorities on their box art then they can claim reviewers to be racist and then get the "work for free" woke armies of the internet to cancel the reviewers.
It's only a matter of time before they start doing it though.
There was a big controversy several years back because nvidia stopped sending review samples to hardware unboxed because they refused to focus on ray tracing performance in reviews (there were only a handful of ray tracing games available back then). After the shitstorm online nvidia gave in and now they're walking on eggshells with hardware reviewers in order to avoid a similar situation
>How are hardware reviews so based?
Almost all big ones got banned for something and receive more money from the sites they release stuff on than Nvidia would ever give them.
That is why only small channels suck nvidia dick nowadays.
Imagine paying 80% of the price of a current console for a GPU that's barely more powerful but has less VRAM despite being 2.5 years newer.
why do retards want their image to be blurry again?
DLAA is the best form of antialiasing there is.
I'm going AMD for my next card
>Average 96.5 FPS at 1440p.
so just like 3070Ti then? cool
Why do these fucks not put the 3080ti and 3090 on their tests anymore?
Hidden agenda? keeping the good shit to themselves? literally nobody reports on those cards yet they destroy everything you throw at them.
Maybe it’s just time for the industry to start hiring competent devs who can actually do their jobs instead of using new GPU’s to bruteforce their way to optimisation.
Oh wait…the competent devs have all left and found real jobs. Pull the fucking plug on this dead horse already
VRAM is literally the easiest thing to slap on to a graphics card.
They slap +64GB on professional cards all day.
If game devs are incompetent then GPU makers are actual knuckle dragging jiggaboos.
Is there ANY reason to upgrade from a 1070?
>2000 series is just an rtx jump
>3000 is scalped to shit, just wait for 4000
>4000 costs more than an entire functional PC unless you want a 5% upgrade to a 6+ year old card
DLSS is genuinely great. Worth grabbing a used 2000/3000 card for.
>DLSS is genuinely great
>USE UPSCALING ON PC GOY!
>UPSCALING IS GOOD
>SPEND MONEY FOR UPSCALING GOY!
literally moved to pc to escape from the blurry upscaling hellhole in consoles where 4k looked like upscaled 480.
because consoles used shitty checkerboard upscaling. DLSS is much better, hell FSR is much better.
The only choices on PC nowadays are blurry and smeary TAA/DLSS or jaggy and shimmering no AA. I miss proper MSAA.
All VR games have MSAA because it gives the highest amount of detail at small pixel counts (VR headsets have relatively low pixel density). 1080p would be still viable if MSAA was an option. 1080p with some MSAA gives similar amount of detail as 1440p with TAA and is easier to run.
DLSS IS FUCKING AWFUL shut your fucking retarded mouth
These gays are starting to botch vulkan support for maxwell cards already. 1xxx is on the line too.
I don't see any reason to upgrade from my 1080, every card now is very overpriced for what they offer
Starve this shitty industry to death, this is the mission of every true human being.
>subjectivity is not a problem when I do it
NTA but ATI/AMD is known to have shitty drivers for 2 decades now, and I personally had problems with them until I switched to 1080.
Admitting to having a problem is the first step to fixing it.
if you buy anything but the 4090 or 4080 you're retarded
and even 4080 is not that good considering 3090 costs less and is about as fast
glad to be a 4090 chad
>4080 is not that good
I hate this meme. It's 40% weaker and exactly 40% cheaper than a 4090.
You'd have to go with a used 3090 to get a better deal and a lot of people are not comfortable buying used cards for a reason.
Also you conveniently omit to mention the MASSIVE FUCKING SIZE of the 4090 and its power requirement (needing a 1000W+ PSU in case of transient spikes)
It doesn't fit the same niche at all
>MASSIVE FUCKING SIZE of the 4090
This is honestly the reason why I upgraded from 3080 in the first place. It looks so fucking chunky and sexy. Had to actually buy a new case lmao
What case? I have a 4000d airflow and I'm probably going to need to upgrade when I get my 4090
Man I've got a 4000D and the 4080 barely fits. I think a 4090 would completely block out the lower PCI ports on my motherboard
what do you need the other PCI ports for anyway?
My msi z690-a motherboard has a really shitty intel NIC (I225-V) and even with driver and firmware updates it has massive spikes in my lan (over 30ms). I got a 20-euro realtek NIC and it fixed all my issues.
oh right, I remember reading about that issue
5000d actually, there's a lot of space. I had s340 elite from like 2017 before
In what world is the 3090 about as fast as the 4080?
Good thing I bought a 4070Ti instead of waiting more.
Is that a good investment? That's the one I've been thinking of getting
Maybe if you're only interested in 1440p and not higher. The bus width is pretty bad and I don't think it'll fare well for 4K or VR
Couldn't give a fuck less about going higher than 1440p. Good to know. I really don't feel like picking up a stupid overpriced 3090ti
It has been fine so far, the VRAM is quite limiting but it can handle 4k with DLSS for now. I'll probably resell it next gen for a newer model.
that card needs a better bus and needs to be like 200-250 bucks. then it would be the next GTX 1080. maybe add 16gb of memory for 30-50 bucks more.
for what it is, it is insanely overpriced.
I wanted a 4090 but I'm not sure I can fit it in my fucking case. I'll just wait for a 5080 or maybe a 4070 super if they get their shit together
>4060 Ti is trash
>4070 is trash
>4070 Ti is trash
So it's 4080 or nothing?
but 4080 is also trash
Why? 16 gig is plenty of memeram even for the broken ass ports unless you want 4k native. I don't understand why it gets shit on besides price but the whole lineup sucks price wise
Because of how much worse it is than 4090 while being so expensive
Its the traditional shitty middle option that makes the more expensive model seem more appealing.
basically yeah. Even though the 4080 is overpriced it's the only good option under the 4090
except 4080 is double the price of a 4070
it's actually just nothing because everything is SEVERELY overpriced. you gays need to learn to stop consooming instead of smugly posting "heh, why are you complaining, are you POOR" as you remortgage your house.
Should I upgrade from a 2070 to a 4060 ti with 16gb?
Fuck no, wait for the next generation
Do it so I can laugh at your retardation.
No, wait for the next gen, I'm on a 2080 super and beyond piss optimized games and ports you are perfectly fine
2070 is still a good card for now.
Honestly, if you're having performance issues and have some extra cash, I'm not sure why people wouldn't just buy a ps cinco for 500 bucks and save for the 5xxx series after that. I don't really see the train of shitty ports ending anytime soon, which makes anything but the absolute top of the line completely unattractive. Plus you get to play FF16/Spidey man
no go for 30 series or wait
Jacket man really playing you all for absolute fools. The best part? You retards are STILL going to buy it in droves and it will STILL get to the top of the Steam hardware survey
>Implying anything will overtake the 1060
I'm not replacing my 2070s anytime soon I think.
Admittedly not many titles lately have gotten my attention.
Why is he such a shill now?
The man will say anything as long as you pay him. He took a deal to shill some shitcoin right when the second wave of mining was in full force and it was impossible to get a videocard.
What has he shilled? He said that the 4060ti runs worse than the 3060ti in some aspects and that Nvidia is greedy for selling 8gb vram which costs $20 for $100.
He literally mocked the card and pointed all of its issues, and said "Go for AMD, This shit sucks"
He is evolving into SSETH
>replace real raster performance with meme upscaling or frame interpolation
>sell it as features
>retards will gobble it up anyway
you deserve this
I paid $399 for my 3060 ti and thought I was a sucker but I love my gpu. 8gb of vram isn't that big of an issue surprisingly with the 3060 ti. I play tarkov at 1440 60 fps, rust 1440 90 fps, etc. My cpu is an i5-11400f though. Should I upgrade my cpu?
I would've posted some reaction image but they dropped like $300 since I bought mine last August. Now is the perfect time to get one.
I'll just make do with a 4070 unless rumors of a Super version start soon.
Aren't the xx60 1080p cards?
Not at fucking $400 in 2023, no.
Not since 2016, and especially not at that price.
>New GPU performing worse than shit already released
Impressive, I’m looking forward to the excuses used to justify this shit existing.
its cheaper to produce than a 30 series and they can mark up more from supply shortages
THAT WILL BE $400 SIR
ENJOY YOUR NEW GRAPHICS CARD SIR
still better than AMD
ah yes, the feature everyone uses for screenshots and turns off when it's time to play the games.
Ray Tracing is a meme, also the rx 6700xt costs 350 dollars, has 12gb vram and is more powerful than the 4060ti.
It is, at most, an okay upgrade for somebody who's still on a GTX 1000 card.
Or just do what that anon said and buy an rx 6700xt.
Same. I somehow managed to buy a 3060 ti for msrp when it launched, and it's been going wonderfully.
>Ray Tracing is a meme
Depends. With 4090 you can play any game with RT at high resolutions while maintaining 120FPS+ even without DLSS FG and the image quality difference is massive.
>searching for excuses to be a chud
Video game board, chud.
>>>/a/ and rot there.
Anime is the last anti chud medium, unlike videogames.
Your discord shilling will never work. Keep trying newfag chud cancer.
cartoon networks used to be cool with it too
One of the most popular anime airing currently has a fucking chud main character.
have a nice day, pedophile.
Somehow every single games doing it is fine.
You're not fooling anyone, discorder twichud.
You lost, fucking chud.
Keep trying, you're a joke, groomed underage.
>still no argument against one of the most popular currently airing anime is about chud shit despite his claim that tranime is anti-chud
Preemptive acceptance of your concession btw.
unironically what chud anime are you talking about?
I think he was talking about Heavenly Delusion which shows how retarded he is because that anime is actively going against the chud narrative
>Anime is the last anti chud medium
>most popular anime right now is literal TRANIME
hey chud weeb you know that japan is a gay chud island right? all shoguns were fags and trannies nobunaga especially. first geisha were male chud prostitutes like you.
Speaking from experience I see
Video game board, fuck off pedo
Honestly, if you're going to make retarded consolewar-tier posts like that, I'm going to join the 4chantards in calling you a tranime gay every single time I see you.
gayry such as yours need not be tolerated.
He isn't wrong. Go ahead anyway.
not anymore you low iq chud. its chudsite
>no reading comprehension
>posting on the wrong website
How come nyidia is so garden gnomey with the vram?
They're literally trying to sell $20 worth of VRAM for a $100 premium. I bet these fucking cards cost less than $150 to actually produce. Their margins are beyond insane. They make Apple pricing looks reasonable in comparison.
I've never hated/wished a corporation went bankrupt more than fucking Nvidia. The most disgusting garden gnomeiest fucking slimy pigs I've ever seen.
They really are scumbags, and the number system is complicated on purpose to trick rubes into buying junk.
the worst part is the competition is non existent
they don't want professionals buying gaymer GPUs for work, so they can't make the memory situation too appealing
The cards are ridiculously overpriced
Sane prices would be
$299 4060 16GB
$249 4060 8GB
$199 4050 Ti 8GB
$149 4050 6GB
Instead you're getting
$499 "4060 Ti" 16GB
$399 "4060 Ti" 8GB
$299 "4060" 8GB
$249 (best case scenario $199) 4050 6GB
Last AMD card I had from AMD was a Radeon HD7770 and it was dogshit even for the time. That's why I don't touch AMD GPUs.
Last AMD card I had from AMD was a Radeon 390X and it was godlike for the time. That's why I touch AMD GPUs.
i forgot that every AMD card is the HD7770 with a different name, grandiose logic there brah
Mate, AMD was garbage 10 years ago, even 5 years ago. There's no real difference in performance now between AMD and Nvidia unless you care for meme shit features. I bought a rx 6600 for 180 dollars and I have no problems with any games, including old games or Japanese games etc.
i've bought good cards from both AMD and nVidia and never had a bad one because i did my research before buying them, that's why i'm touching AMD and will eventually touch nVidia once they start making sensible products and prices again.
Who asked you, dipshit?
Good. Keep buying Nvidia retard.
I have the A770 Arc. Cost me $300, 16gb ram and performs the same as an RTX 3070
same gays that think AMD chipsets still melt themselves
Linus made an entire video trolling the card and snidely suggesting to "go for AMD, this shit sucks for its price tag and gimmicks aren't enough to sell for performance"
I'm still rocking a 1060 thanks to njudea, buttcoiners and scalpers.
been thinking of making the switch to AMD but not sure about it. Read that even their budget cards basically require water cooling these days because of how much power they push to keep up with Nvidia, while i've never fucked with water cooling...
When did it go so fucking wrong PC bros?
When Eth mining and AI slop took off.
We all deserve to die in a ditch.
lol, even my RTX 3080 16GB laptop is faster than this piece of shit
>Graffix cards are too fucking big
>Need to take the mounting brackets and card out to put in a new NVMe/plug in a hard drive to SATA
>Either take the cooler out to plug in a fan or slice my hand open
I miss my old shitty build bros
Just buy the Intel ARC. $400 for 16gb vram and it keeps up with the RTX 3000 series.
OH NO NVIDIA SISTERS
>nvidia is killing itself so hard on mid and low range that it started trading blows with Intel
There's also the fact that Matrox still operates in the GPU market and has stated they could return at any time. Intel and Matrox teaming up would be "interesting"
why would they team up instead of matrox doing its own thing?
give intel 2-3 years to cook and they will deliver. they will literally undercut amd and nvidia and have better performance.
Intel...Please...Save PC Gaming...
intel were the ones who started this shit with their CPUs and still do to an extent
>just bought a 3060 Ti in january because my old PC died and it was the only card I could find for MSRP
>was thinking of replacing it with a 4060 Ti
>it somehow runs worse
I'll just keep this until the 5000 series comes out
Why not get a 4090 at that point? You were gonna spend $500 on a 3060 Ti and then upgrade to a 4060 Ti anyway
I spent 400 on my 3060 ti. I was thinking of selling it and just paying the difference of like 200 for a 4060 Ti, not 1200+ for a 4090 lmao
I bet the 5060 Ti will have like a 1-5% performance increase compared to the 3060/4060 Ti
>nvidia making skylake
Love to see it
who the fuck upgrades to a budget card from a budget card after just one gen?
I just bought it lol. I figured if the 4060 Ti was a decent enough power upgrade I'd just sell my barely used 3060 Ti for one, but since it's not really much of an upgrade I'll wait. It's not like I've had the thing for three years.
still a stupid idea
>I'd just sell my barely used 3060 Ti
Bro, are you retarded?
Used GPU prices have cratered, you'd have been selling your 3060 Ti for sub-$300.
If you're at 1080p I don't think there's a reason to sidegrade at this point, unless you're really really vram starved.
I'm at 1440p which is why I was considering it.
Understandable then, shame it turned out to be a scam.
Why not just go straight for 4070? That one actually works decent and isn't that much more expensive. Probably could last easily to 60 series.
The 4080 being 1200 bucks is the real kick in the balls.
It's the only card with specs just good enough to future proof the current gen. Nvidia aren't retarded, they understand this, and that's why they destroyed the price for that tier in particular.
This way, instead of buying it for $1200 and using it for 6 years, you buy a 4070 for $600 and buy another $600 card in 3 years.
I like playing games on my 6800XT and seeing the VRAM usage pass the 8GB on the overlay, it reminds me of how good i have it compared to the poor nVidia users.
Maybe the CMA wasn't completely wrong after all.
PCbros... The PS5 is looking like the better option right now... Go on without me...
>40fps on elden ring in performance mode while the PS5 is 120 degrees celsius
Sure it's still not ideal, but it doesn't stutter and isn't a thousand dollars. Not to mention all of the games this year that are just about unplayable on anything less than a 4090/7900xt
>4000 series is garbage
>5000 series delayed to force people into buying 4000
I'm going to be on my 3080 forever aren't I
>5000 comes out
>even worse power to price ratio
>everyone swaps to AMD for high end cards
>Intel for low-mid cards
>Nvidia crash and die
Why do you need to upgrade from a 3080? The PS5 is barely stronger than a 2070. The people you need to feel bad for are the people stuck with a 1060 and don't want to get garden gnomeed.
I just like new tech 🙁
>tfw got a 3070 fr at msrp from bestbuy when all the scalpers were canceling their card orders due to bit coin tanking
I upgraded from a 660ti and made a whole new PC, but $500 msrp hurt my soul.
What's a good card for top tier VR?
still rocking my 2060, I guess I still don't have to upgrade yet
Fucking dying at nvidia shitting the bed this hard lately
It's starting to look like the end for them.
not with their market share
AMD JUST UNDERCUT NVIDIA YOU STUPID FUCKS
garden gnomeMD will just price it 50 shekels cheaper than garden gnomeidia, because they don't care about gpus, they make all their money in cpus. Just look at the rx 7900xt shitshow. I guess wait for intel battlemage, or next gen if it's any better
Intel are just as bad as Nvidia. Look how their shit was priced before Ryzen.
>Intel are just as bad as Nvidia
intelaviv released quad cores for 10 years, but at least the price stayed the same and you got 20% ipc, meanwhile garden gnomeidia gives you worse products for the same money
I CAN'T FUCKING BELIEVE I GAVE AWAY MY 2060 TO MY LITTLE BROTHER THINKING IT'S TIME TO UPGRADE BUT THE garden gnomeS FROM NVIDIA STARTED PULLING THIS SHIT FUCK YOUUUUUUUUUUUUUUUUUUUUUUU
Take the 3060 12gb pill anon. Wait them out.
Intel undercut Nvidia harder.
the problem being New Guy grafix drivers. Oh no, AMD's drivers won't get to be the butt of the jokes any longer!
At this point, if you have 400 bucks to spend and you're running older hardware, it's literally a better idea to just buy a PS5 or Xbox Series X.
It's honestly unbelievable what Nvidia is doing here. I can't believe we reached the point where buying a console is a better bang for your buck than buying a new GPU.
Fuck you Nvidia.
>buy a PS5 or Xbox Series X
>have to pay for multiplayer
consoles are a scam, just look at garden gnometendo selling a raspberry pi with a screen for 300 shekels
>has mostly the same games as PC, often earlier too on PS5.
>PS5 has been jailbroken, you can absolutely pirate games for it if you're that poor.
>okay fair enough, paying for multiplayer is gay as hell, so my previous advice only applies to singleplayer Chads, not multiplayer zoomers.
Consoles are indeed a scam, but less of a scam than these new Nvidia cards. You're getting fucked either way, you just get less fucked with a console at this point, which honestly has been the case since the RTX 3000 series because of crypto mining, which is still affecting the market even now (prices are still high and only coming down very slowly).
I personally run an RTX 4080 but only because I need a good PC for more than just gaming. If gaming was all I cared about I would seriously have considered just going back to consoles.
And what games do you get for those $400 spent on a console?
they're cheap because it's a way to nickel and dime normies with goyslop and subscriptions.
Absolutely not lmao. The PS5 and Series X achieve 1440p and 4k resolution by rendering at sub 900p, upscaling, and then using FSR/checkerboard rendering to hit the final target resolution. Even a 3060 can do native 1440p at around 60 FPS in modern games using higher settings than what the PS5/Sex use.
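Quick napkin math on that upscaling claim (a toy sketch; these are just the standard resolutions, actual console render targets vary per game and are usually dynamic):

```python
# Napkin math behind the "sub 900p upscaled" claim above.
# Assumed fixed resolutions; real console render targets are dynamic.
res = {"900p": (1600, 900), "1440p": (2560, 1440), "4k": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in res.items()}

print(f"900p is {pixels['900p'] / pixels['4k']:.0%} of a 4k frame")
print(f"1440p is {pixels['1440p'] / pixels['4k']:.0%} of a 4k frame")
```

So a 900p internal render pushes under a fifth of the pixels of a native 4k frame, which is where the "it's not really 4k" argument comes from.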
Idk man anecdotally I get more even performance with new games on my PS5 than my 3080 and that's what matters to me personally. I'd rather have a consistent frame rate with upscaling than awful stutters every 5 seconds
that is literally not possible by any reasonable metric.
I don't get stutters that I notice as much on PS5. Why would that be? You tell me I guess
Which games are rendered at 900p on the PS5/XSX? Pretty sure most of them at least hit 1080p, often 1440p, which is the same as running DLSS or FSR at 'Balanced', which you'd need to do anyway if you want a 4K 60FPS experience with the same graphics as a PS5/XSX on an RTX 3060.
There are probably more but those are the ones that come to mind
>Lists two of the most horrendously optimized games ever, which your RTX 3060 wouldn't run any better, probably worse actually, with more VRAM issues and therefore more stuttering.
Also I'm pretty sure the XSX runs those games better than the PS5, but I'm not sure, I don't own either console and I haven't looked at any XSX benchmarks, I just remember hearing it runs those games better than the PS5 does.
Jedi Survivor is running at around 640p reconstructed to 1440p and then upscaled to 4k at a silky smooth 20+ FPS.
You just made that up.
perf mode is 60fps not 20fps, that's quality mode.
hahahahahahah, even consoles are getting affected by shitty ports
The video literally shows the game running at 'prefer resolution mode', which renders the game at 1440p 30FPS (though often with dips to 20FPS). While 'prefer performance mode' renders the game at 864p 60FPS.
I don't know if you're just misinformed or disingenuous, but whoever made your vid rel is bullshitting you.
Either way, Jedi Survivor is garbage anyway and I wouldn't wanna play it on any platform.
You are being more disingenuous. It isn't 864p but as low as 648p, so the overall experience is probably closer to a constant 720p. And it can't even hold 60 fps, lots of dips below 50. So the real performance experience on console is 720p at 50 fps, which is an even shittier experience than what you said.
yeah but at least it's consistent, and the performance can in theory be fixed by lowering the graphical fidelity, optimizing the assets or effects, or just enabling VRR, because 50fps is not that noticeably jankier than 60fps, and with patches they'll eventually do it
the problem is that the games are rushed or trying to push the graphics too far, and developers are intent on fixing it (eventually)
on the other hand the problem with PC ports is systemic - devs are notoriously neglecting to have the game generate shaders on first launch and this won't change because devs don't see a problem with it
ok you bought the console, how do you now pay for the online or games etc.
Series X is the best value
>use your old discs
>download retroarch, ps2, Dolphin, ppsspp, one day maybe even ryujinx
>plays newly released AAA slop without installing 45 launchers
>GamePass for AAA slop, play it and throw it away
>quick resume even for 360 games
>upscale, upframerate 360 games
This thing can play every kino game there's been and the only argument pc has against it is "but muh obscure dlsite dicky game!!" for which you can just get a 100$ used thinkpad. Ps5 has nogaems
>in 4 but idk how to do dev mode
>PS5 has no games
>Series X literally has no exclusives aside from those grandfathered in
Say what you want about Sony's first party output but at least they care about making their product marginally attractive
>b-but muh exclusives!!
Irrelevant. 5 (4) games aren't enough to justify dropping backwards compatibility all the way back to MSX games and all the way up to 360 games. I rather play Ico than Ratchet & Clank: Cutscene Fest
Bro you can emulate Ico anywhere. What games in the Xbox library are worth that to you genuinely curious?
PS5 is getting new AAA exclusives this year
>Bro you can emulate Ico anywhere
Not on a PS5 you can't
>What Xbox games?
Don't try to make the argument of exclusives vs exclusives, none have enough to justify a purchase on their own. Xbox has better features like Quick Resume and backwards compatibility, and it can take the spot of a PC for most vidya uses. PS5 is basically a glorified PS4 PRO2 that's both massive and ugly looking. 6 months of FFXVI ain't going to change that.
If you want PC features you should buy a PC. PS5 is the better option if you want to play new shit because it'll play everything plus whatever exclusives
>6 months of FFXVI
This is cope. They are aiming for like a two year exclusivity window
I still own my 1080TI that i got as a gift in 2017
I expected next gen to be bad, but not THIS bad like holy fuck. Duopoly blows but intel cards are nonexistent where i live.
t. bought 6700xt just in case
1060chads we still hodling it seems
I'm not HODLing shit, this piece of shit card can barely run 5 year old games on every single setting set to lowest at 1440p
Fake news + new games are shit anyway if true
Seems like a poor choice of browser problem then
I vill NOT upgrade, moron garden gnome
Enjoy your Jerry Rice and Nitus Dog Football at 480p brother man
>Seems like a poor choice of browser problem then
No. Anything using HW acceleration will consume VRAM. Granted in my case it's just more browsers (Spotify, Memecord, Steam).
If you're going to tell me I should use my computer less in some way, that would be a massive cope. Under extreme conditions where it's necessary, like with the 1060's 6GB, then sure, close every program to run a game if you want to deal with that.
Not that anon, but I use my laptop hooked up to a second monitor and run the audio into my desktop's line-in port to take the load off my desktop.
Sounds weird but it works.
At the rate we're going, I'll still be using a 1060 in 2030.
I kept running into VRAM limitations, 1440p didn't help, so I respectably upgraded to a 3060.
With a browser open in the background 1GB of VRAM would be reserved by default.
You're reaching poorfag cope status friend
Probably going AMD/Intel in 5 years as Linux support rapidly improves and they actually compete on perf/$.
I already use Linux, but the proprietary nvidia driver hasn't given me any reasonable headache yet.
96.5 FPS at 1440p.
This is good and not bad.
It's literally not even 10% better than a 3060 Ti.
It's a great update for people with a GTX1070.
just buy an RX 6700 or 6800
Why should I buy an AMD gpu if I don't use Linux?
new gen AMD is DOA, the RTX 4000 series is a waste of silicon sold at scalping prices, and Intel is a meme. The only viable alternative is buying last gen AMD; they perform similar to or better than the 4060 and are sold at reasonable prices or cheaper.
Also "le bad drivers" is a meme and hasn't been true since the launch of RX 6000
>Trying to trick people into buying AMD so Nvidia is forced to get their shit together
Based. Feel bad for the retards that take the bait though
He's right though, you can buy a rx 6700xt for 350 dollars, which has 12gb vram and is as powerful as the 4060ti, which gets bottlenecked by its 8gb vram.
AMD has better value graphics cards than Intel does, though.
>AMD has better value graphics cards than Intel does
wrong. A770 is on par with an RTX 3070 for $300
No, the rx 6600xt which is $250, is better than the 770.
>the rx 6600xt which is $250
show me where you can get one for that price.
Not him but the first result on google shows one on newegg for 250 with a 10 dollar off promo so 240.
oh, the market went to normal. Might cop one, thanks.
AMD cards have been like this for a few months now thats why nvidia shills are out in force
best cards are currently as follows
RX6600 $199 for 1080p/80fps+
6700XT $249 for 1440p or high 1080p
6950XT $649 4k or high 1440p
4090 $1599 4k, actual rendering and ML
3070 here and not even bothering with memetracing and dlss since games I play all run 1440p/100fps+
is the 3070ti worth it? From what I see, it's not much performance increase and it runs a lot hotter. Is there something I'm missing?
God I wish. Ever since the 4xxx cards came out my country jacked up the prices because of the low VRAM meme and Nvidia's fucked up MSRP pricing.
It almost reached $290 in my country last July and now it's risen to almost $420.
I feel like a fucking retard for being a waitfag.
meant $340 for 6700XT my bad
yeah if it was $249 that would be only card i would recommend
wouldn't bother with the 3070ti, would just grab the 4070 for that price if you really want to go with nvidia
even though it's fucked what nvidia did with the 4070, it's still decent
wish they had priced it $100 cheaper
At this point you can even get 6650XTs for around that price.
>which gets bottlenecked by its 8gb vram
>Techpowerup used RE Village for ray tracing test instead of RE4 Remake
The features like DLSS and ray tracing are going to matter more and more though. It might not be that big of a deal right now but as devs get lazier and lazier stuff is going to be designed with that in mind and a card that excels at raster right now is going to get left in the dust
>The features like DLSS and ray tracing are going to matter more and more though.
Not for any current GPU, by the time raytracing really starts to matter it's going to be unplayable on every single GPU currently on the market, even at 1080p.
Ray tracing you are probably right, but DLSS is already pretty much mandatory if you are aiming for 4k on something less than a 4090
>but DLSS is already pretty much mandatory if you are aiming for 4k on something less than a 4090
Pretty much anything beyond a 3070/4060Ti/RX 6800 is going to be capable of 4k60, DLSS/FSR is only necessary at that res if you're aiming for 144+ Hz.
No it's not, I had a 1070 and I upgraded to an AMD GPU instead because Nvidia's price-performance fucking sucks ass nowadays.
No way in hell would I pay $400 for a GPU with the same amount of memory as my old 1070.
It's a shit upgrade for 3000 owners but not too bad for anyone that hasn't upgraded yet.
I had my eyes on a 4090 FE but I'm just gonna buy a PS5 and an irokebijin e-boi doll instead because nothing actually takes advantage of all that performance, it's just necessary to account for devs being lazy retards. Feels like subsidizing H1B pajeets
the 4060ti is not a good upgrade for anyone. 8GB is a joke.
the 4060ti 16GB might be, but at a cost premium.
At this point I'm just gonna upgrade my 1060 to RX 6600 just because it got similar watt and wait another 3 years,
I said 3 years because there will be no next gen Nvidia card anymore until at least two years because these retards try to scam their consumer and cause massive stockpile of shitty overpriced GPU.
>mfw i got a 3080 for MSRP the minute they launched
I was blessed. One time.
Honestly this is the most golden opportunity for Intel to undercut the absolute motherfuck out of Nvidia and AMD and steal a giant chunk of the GPU market because both of them have gone pants on head retarded mode and think people are going to spend 500+ for an xx60 card or equivalent.
Remember when 60 were the budgets card, 70 were the midrange and then 80 were the flag ships.
Why do they all cost stupid amounts of money? 60 should be no more than $250, 70 should be $350-$400, and 80 could be $600. It's so retarded that it's just accepted now that a GPU costs 2x its value.
This isn't an xx60 card, it's a fucking xx50 card sold at an xx70 price.
Why the fuck does it have similar performance to the old gen card with the same name, but at an increased MSRP?
Because they introduced the 90 series and ruined the entire hierarchy.
>the editor evades all the questions
thank Allah (pbuh) for the 3080ti, the greatest card to grace his holiness' realm
I bought a 6800 xt and there is a noticeable background buzz when running certain games
you whine more than your coils
That's coil whine. It usually happens when you have stupidly high frames in a game, or it could just be an unlucky batch. If it bothers you that much, return it, because it won't go away.
Try capping your FPS, if it doesn't go away, consider returning it.
Coil whine scales with core amperage so if you undervolt then the noise reduces. You have to find the point where it doesn't whine anymore at a certain clock/voltage range.
some cards will just do it anyway though.
What the shitting fuck do I get for 1440p
grab a used 2080 super or some shit
Go AMD and get an RX 6950 XT, you'll pay only a little more than this garbage RTX 4060 Ti while being able to game with super high FPS at 1440p, and you could even do 4k gaming on that card.
At 1440p you might even be able to turn on raytracing and still get good FPS, while at 4k you probably want to keep raytracing off with the RX 6950 XT.
Even a fucking 6700xt is better than this housefire.
Wait what the fuck happened here?
I thought it was weird when the rest of my sub box was shitting over it, J2C's video was praising it, and Jay wasn't even in the video.
what's a sub box
it's where I keep my lunch :3
garden gnomevidia shill
Their RTX 4060 Ti review was a positive one; they said it was a good card and a great upgrade for RTX 2000 users. They didn't really point out most of the negatives of the RTX 4060 Ti.
Suffice to say, their viewers felt like their review was just shilling Nvidia garbage and lost all trust in them.
GAMERS RISE UP
>Gets rich by endorsing mining and pushing miners during gpu drought
>says a 970 is fine for gaming in 2021
>his social media is him building his expensive housing off miner money
I wish a moron would rape his wife and children but he would probably enjoy it.
Can't take him seriously anymore anyway
I got a 3080 recently for like $530, how did I do?
very good, assuming it's the 12gb version
1080p, you said?
I took the gaming laptop pill since desktop gpus keep getting more expensive and laptops keep getting better and cheaper
just play older games until you finish your backlog; prices will drop. there's so much shit to play anyway: gamecube games, old pc games, whatever the fuck. these new games are not particularly good, so play older games, and by the time you've finished your backlog the card price will be cut in half and the price of games will be cut in 4
It's fun to play stuff when it comes out to be part of the discourse.
I'm gonna upgrade to an RX 6600, since it's priced at only US$180 in my country, to replace my 1060 so I can finally finish my backlog of Ys games at 144 fps.
Just bought a 1080p 144hz monitor because my old one is starting to show its age, but the 1060 can't handle some old games at high settings at 144 fps.
Not gonna fall for the 4k monitor meme yet because of how retarded modern gpu pricing is; I won't fall into the meme scam.
you're joking, right
I'm playing Celceta and I need to turn down some settings just so I can play it at over 90fps with my 1060.
why don't you play it at a regular framerate? you need a new gpu to play rpgs at 144? that makes no sense to me, but whatever, it's your money
I mean, the 1060 is an old card and I can sell it for at least $80.
Another $100 is nothing just so I can play my backlog at a framerate that matches my monitor.
> regular framerate
After noticing the differences between 60hz and 144hz, regular framerate feels so choppy for me.
Also, I'd rather sell this old GPU at a proper price before it dies on me and get a new GPU with a new warranty.
>60fps looks like dogshit
>144 fps feels like seeing 60fps for the first time now
>30 fps feels like a power point slide
AAAAAAAAA I RUINED MYSELF.
A 4k monitor is even worse: if you've got a shit GPU you're now playing at a blurry shit resolution just to get proper fps.
that's fucking bullshit. I usually play at 60, but sometimes for more cinematic games that look good and don't have too fast paced gameplay, like FF, I lower the framerate on purpose to like 40 fps to make the picture look richer, some other games sometimes 50 fps, and you absolutely get used to it pretty quick. this "ruined myself forever" thing is total bullshit, it's relative
Every time I set old games to 60fps, my eyes get raped. Turning the camera is like flipping through a notepad cartoon
I don't believe you, but regardless, of course you feel the difference because of the relativity. The first time you launch a game, set FPS to 40 or 45 and play like that for an hour. Then turn the framerate to 60 and it'll feel good again. You feel it because your brain isn't used to it, it's all about the contrast. The hard limit (meaning regardless of relativity) is like 30 fps, you can definitely tell. But from 45, to 50, to 60, after a few minutes you don't even think about it anymore. Sometimes you even want a lower framerate to make a cinematic game look fuller
I remember raiding in WoW with 10-15fps, now anything dipping under 100 makes me worry.
1080p60 chad reporting in
The only reason 60 FPS feels like dogshit is because modern games are poorly made.
60 FPS feels silky smooth if you have a flat frame timing graph, but if your frame timings are all over the goddamn place (which is often the case with modern games) a 60 FPS experience can feel worse than a 30 FPS experience.
With games like Hogwarts Legacy you really do indeed need ~120FPS to make the game feel smooth.
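The frame-timing point above can be shown with toy numbers (made-up traces, not measurements from any real game):

```python
# Toy numbers showing why average FPS hides bad frame pacing:
# both traces average ~60 FPS, but one alternates fast/slow frames.
flat    = [16.7] * 60          # ms per frame, evenly paced
jittery = [8.0, 25.4] * 30     # same average, terrible pacing

def avg_fps(trace):
    # frames delivered per second of wall-clock time
    return 1000 * len(trace) / sum(trace)

print(f"flat:    {avg_fps(flat):.0f} FPS avg, worst frame {max(flat):.1f} ms")
print(f"jittery: {avg_fps(jittery):.0f} FPS avg, worst frame {max(jittery):.1f} ms")
```

An FPS counter reports ~60 for both, but the jittery trace keeps hitching on 25+ ms frames, which is why it can feel worse than a locked lower framerate.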
I'm not playing modern game tho.
Dunno why you quote me.
All modern games are not worth playing, this is why I choose to buy RX 6600 soon.
ehhh not really I have a real time frametime graph with rivatuner and modern games don't suffer from poor frametime consistency
60fps in old games feel equally shitty if they have high speed camera movements
60fps in VERY old games like SNES feels great because the screen scrolls slowly compared to how quickly the camera moves in 3D games
you are also forgetting that you will never get perfect framepacing unless you are using the FIFO present method; if you did your test with v-sync off you are just deluding yourself
you probably just have some shit old hardware that suffers from massive frametime spikes and is only fast enough for older games
or you have amd, which is notorious for having shit frametime consistency, commonly known as the AMDip
>ehhh not really I have a real time frametime graph with rivatuner and modern games don't suffer from poor frametime consistency
Go try Hogwarts Legacy, The Last of Us Part 1, Jedi Survivor, or even The Witcher 3 Remastered, and tell me more about how modern games don't have poor frametime consistency.
>60fps in old games feel equally shitty if they have high speed camera movements
Such as? Which games?
>60fps in VERY old games like SNES feels great because the screen scrolls slowly compared to how quickly the camera moves in 3D games
The reason SNES games feel great is because they typically run at a very consistent 60FPS with consistent frame timings. Though in my experience most SNES games often have insane FPS dips in certain situations, for example when doing more visually impressive attacks in Final Fantasy 6.
>you probably just have some shit old hardware that suffers from massive frametime spikes and is simply fast enough for older games
>or you have amd which is notorious for having shit frametime consistency commonly known as the AMDip
Pic rel, my hardware.
Still waiting for FSR 3
FSR2 is shit compared to DLSS upscaling, and that's not due to a lack of AI stuff; AMD just can't make a proper temporal upscaler. UE5, while kinda shit overall, has TSR which shits on FSR2.
DLSS Frame Generations uses a fixed hardware block. What makes you think AMD can make FSR3 look good WITHOUT that if they can't even get FSR2 right which is a much simpler concept?
The 4090, 4080, 3090, 3080 Ti, 3080 12GB, 6900 XT, 6800 XT, 6800, and 3060 12GB are the only cards worth considering right now at their current prices.
You forgot the RX 6950 XT. Unless the prices for that one went back up again? Last time I checked it was the single best bang-for-buck card on the market right now.
I just treat the xx50 RDNA2 refreshes as the same cards because they essentially are: just binned cards with the software memory frequency slider unlocked, which is kinda scummy, because even garden gnomevidia refreshes use different GPU SKUs, and they don't lock memory overclocking on any card; the only limit is the memory's silicon quality.
No, if you're spending that much you may as well go for the 4090
God no, 3090 prices are shit, even used they're going for $700+.
Basically the same GPU, if you can find them for sub-$650 then sure.
Sure, assuming you can find the 6900 XT for sub-$600 and the 6800 XT for sub-$550 (both new)
Ehh, borderline, at sub-$500 it's not bad.
Honestly, if you're looking for a 1080p card with some light 1440p dabbling, the 3060 12 GB, 6700 XT, and A770 are basically the only options that make sense recommending.
>1080p looks blurry on a 27 inch 1440p monitor
>No, if you're spending that much you may as well go for the 4090
I agree and that's what I did but technically the 4080 has a slightly higher price/perf and some people are within a budget where that $400 is better spent elsewhere. A 4080 + 7800X3D/13700K would be better than 4090 + 5600 and the patented Zen 3 Glue Dip.
The card was made to be used with DLSS3
Benchmark results without DLSS3 and frame gen turned on are disingenuous. The end user will turn them on, and with them they'll get double the FPS of a 3060 Ti.
>this moron fell for the frame generation meme.
Nvidia has you hook line and sinker.
It's not a meme anymore when everything runs at 20fps with it off
It's even more of a meme then, as that's when frame generation is at its worst.
Also wtf are you even doing to make your games run at 20FPS? Just lower your graphics settings, lmao.
Funny thing, it barely works in new games. Running on low and ultra gives almost the same fps most of the time. Shit is fucked.
every game still runs on the GTX1060 you wazzok
You might as well plug a banana peel into your pcie slot
gee, it's almost like the GPU market is completely fucked and people refuse to pay $600 for a midrange card thats worse than the last gen
The GPU market is fucked but if you are still running a 1060 you are taking the waitfag meme too seriously bro. Are you just never going to upgrade again until a 2X upgrade is so worthless someone will give you it for free?
nah, why upgrade. 1060 runs most games fine and new AAA releases are dogshit in optimization anyway. It's a great 1080p/60fps card
Bro... It hasn't been a 1080p/60fps card for half a decade
Like he said, that's only true if you want to play unoptimized AAA garbage.
>turn off useless settings like depth of field
>dont put useless shit that dont change anything on ultra
>limit framerate at 50
literally still running every fucking thing
>literally still running every fucking thing
Except all that unoptimized AAA garbage you admitted it won't run lol
I can get a lot of frames out of Full Tilt! Pinball with a card from 20 years ago so I guess I can call that a 1080p 60 card too
the fuck are you talking about, i just played Witcher 3, Deus Ex: Mankind Divided, and Mad Max and had no problem whatsoever
It's really just Nvidia prices that are fucked.
$600 on the other side gets you a GPU that tickles the ballsack of the 2nd-strongest GPU in the product stack.
Even the 7900 XT is going for as low as $700 nowadays, which isn't amazing or anything, but for a high-end GPU in the modern market it's basically a steal.
>as low as $700
I miss the 2010s. I remember when $400 could get you a flagship gpu
Yea, blame the retards who bought Titans for $1k back in 2013.
I miss the days of getting an x80 GPU at launch for $500 or less.
in amd's and garden gnomevidia's defense, these cards are also more expensive to manufacture than they were in the past
in some tiers the margins have stayed the same, some have even shrunk
FG is not a meme. I use it on a 4090 even though I don't need it because at ~175 FPS I can't tell the difference between native and FG even if I look for it and the GPU uses like half the power.
Using FG to go from 20 to 40 FPS is retarded and artifacts are insane but something like 90 to 180 FPS is basically a free 2x boost. The only downside is extra 5ms input latency which I can live with. With Reflex even with FG it's still lower than AMD native lmao
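The "extra 5ms" claim checks out as back-of-the-envelope math. Here's a rough sketch (illustrative numbers, not measurements; the half-frame latency model is an assumption about how interpolation-based FG behaves, not a figure from Nvidia):

```python
# Back-of-the-envelope frame generation math (illustrative, not measured).

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given framerate."""
    return 1000.0 / fps

native_fps = 90
native = frame_time_ms(native_fps)         # ~11.1 ms per rendered frame
displayed = frame_time_ms(native_fps * 2)  # ~5.6 ms per displayed frame with FG

# FG interpolates between two rendered frames, so it has to hold back one
# real frame before displaying; the perceived extra input latency is roughly
# half a native frame time (plus interpolation overhead).
added_latency = native / 2  # ~5.6 ms, in the ballpark of the "extra 5ms" above

print(f"native frame time:     {native:.1f} ms")
print(f"FG displayed interval: {displayed:.1f} ms")
print(f"approx added latency:  {added_latency:.1f} ms")
```

Which is also why 20→40 FPS FG feels terrible: half a native frame at 20 FPS is ~25 ms of added latency, five times worse than at 90 FPS.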
>Benchmark results without turning on DLSS3 and frame gen is disingenuous.
More like, Nvidia's attempts to hide how hard they're skimping on specs to raise their profit margins are disingenuous.
what's the point of raytracing except on the two top cards that can actually run it? wanna run super mario with raytracing?
>tfw I'm playing DRG on a probably fairly old gaming laptop with an equally old graphics card I bought for fairly cheap
How about you stop enabling "AAA" slop you fucking retards?
I'm interested in seeing hybrid upscalers. DLSS + FSR3 frame gen on arbitrarily-limited-30 series.
>NoVRAM: 128-bit bus is fine because tons of cache!
>GAYMD: lol nice we can sell our hot garbage at no discount now!
Intel legitimately has a chance. Never thought I'd see the day
>lol nice we can sell our hot garbage at no discount now!
But AMD has the best value GPUs on the market right now.
They also have no drivers and randomly crash to black screen for minutes at a time on the most standard of games.
I have had a 7900 XTX since release and I've yet to experience this; these are the games I played.
Also FFXIV and some other shit on EA and Ubishit.
>works on my machine
keep replying, you're a shill and it's obvious. RX 6600, first AMD card in over 10 years. Immediately crashed to black screen.
On what game? I have no idea what you played, so it's possible. My 1070 GTX kept crashing on Mon Hunt World for 6 months before they fixed it.
I'm telling you my 7900 XTX is running all the games I have posted here with 0 issues. And I mean literally 0 issues. I assume this is mostly because all of the games are old.
I don't care, I have it for the ecosystem. I already own every game in existence because I can pirate.
just because you can't see it doesn't mean it's not there
my 6900 xt had problems with almost all dx11 games and stuttered often
tried reinstalling drivers multiple times with DDU, AMD's official uninstall tool and nothing helped
I fixed it by switching to nvidia
What were these so called dx11 games?
>so called dx11 games
Are you really that much of a retarded moron to ignore how many games still use DX11?
AMD itself acknowledged this and has rewritten their entire DX11 driver https://community.amd.com/t5/gaming/stability-performance-and-great-experiences-with-amd-software/ba-p/530424?sf258519940
for 1.5 years before this it was fucking miserable, but yeah, it's great AMD fixed their fucking $1000 GPU to not stutter like crazy 6 months before the new generation
opengl driver was even worse and also got a complete rewrite
but yeah, it's fine to be treated like garbage for this long
AMD: just wait (TM)
sounds like malware
Yea, I'm on a 6950 XT and I haven't had any issues either.
I'm convinced that these people are genuine shills.
huh? what gpu do you have?
so integrated graphics?
hahaha, JayzTwoCents tried to shill this card positively but when he saw that everyone else was shitting on it he deleted his review from youtube
I've hated this guy so much for years now.
I distinctly remember when the "adpocalypse" happened, he was moaning about how he wasn't making any money and how he could barely scrape by, meanwhile he fucking bought a sports car, then defended himself with "it's only a $40k car, that's not much"
Fucking unreal. I've hated him ever since.
something about fat nerds having long hair triggers me.
>there's no problem with a $400 card using a 128 bit memory b-ACK
>sells you a rebranded gt 4030 for $400
nothing personal goyim
When will you fuckers stop backing Nvidia no matter how much proprietary lockdown, bullshit non-standards, overpricing, and "oops, hand caught in the cookie jar" they pull? The ONLY Nvidia card worth buying is the 4090 (ironically also the least overpriced) if you have $1600-2000+ and are willing to spend every cent for massive performance. Anything else, you're better off with AMD (or at the low end, for 1080p or video encoding, Intel ARC). The 7900XTX sits between the 4080 and 4090 in rasterized performance, does RT decently, and has 24GB of VRAM, costing almost HALF of the 4090.
>but but but mah ARRTEEEEEE
7900XT/XTX is decent at RT, though not as raw-powerful as the 4080/4090 at 4K RT. However, except for maybe the 4090 in certain circumstances, turning on RT means either lowering other settings, using DLSS/FSR/XeSS, or lowering resolution. Most RT stuff is an ooh-shiny that doesn't change anything meaningful yet carries a huge performance cost anyway, and enabling it with FSR/DLSS will be fine
>but but mah DLSS
NV sucking off proprietary standards again. FSR and XeSS are open and platform-independent and work just as well. You can point to bad implementations of anything, DLSS included, or cherry-pick a freeze-frame comparison, but the vast majority of the time they're equal.
If you need it for work and you're stuck with it, that's one thing, but otherwise you're only emboldening NV to push yet another proprietary standard in the hopes of locking people in.
>but but DRIVERS
Mostly an outdated complaint. If on Linux, it's not just outdated, it's inverted: AMD's FOSS drivers are superior to everyone and everything else. On Windows, they're about equal to NV as far as proprietary drivers fucking up (or not) goes.
Nvidia will never stop locking shit down and slapping on huge price tags so long as you fags excuse whatever they do and keep buying their shit. Either buy a 4090, or go AMD/Intel.
>For Windows, they are about equal to NV as far as fucking up or lack thereof, proprietary drivers go.
The delusions of fanatical shitMD cultists know no bounds.
Is this a case of
>WAHHH I HAD DRIVER PROBLEMS EVERYONE BAD HOW COULD THIS HAPPEN TO ME
Some sort of selective blindness where it only matters if it's AMD? I've built and run multiple systems for others and myself over the past decade or so, and in the modern era Windows drivers for NV and AMD are both "fine most of the time, until all hell breaks loose" due to a software/game update or some other incompatibility. I troubleshot ATI back in the bad old days; I remember when bad Windows drivers weren't just a meme.
Something is wrong here. Smells like driver issues. I suspect Nvidia will release a driver that gets 20% more perf over the 3060 Ti.
Cope. Not gonna happen, bro.
AMD cultists are worse than snoys. Absolute subhuman filth.
>Nvidia openly tries to garden gnome you, again
>"but AMD suuuucks!"