Actually it is closer to the 2060 (non-Super) and the RX 6600 XT
2 weeks ago
Anonymous
>Actually it is closer to the 2060 (non-Super) and the RX 6600 XT
and yet it will give 2x the performance of a PC with a 2060. even a 4090 struggles to run the latest ps5 ports smoothly while the ps5 has no issues. PC gaming is a meme.
>consoles are 6700(xt)
lolno?
Their real world performance is closer to that of a 5500xt. Their games also only run on low-mid settings with upscaling out the ass.
please direct me to an entry level card
last i checked it was supposed to be the rx460 but they didn't actually exist
and don't even get me started on the gt 1030
>only aaaa game that might be worth playing on the horizon is starfield
>no reason to upgrade for it before significant mods are out in a few years
You are genuinely retarded if you buy a 40 series card.
>you need more than 8 heckin vee ramerinos
>what about all the frickin' NEW SONY PORTED GAMES AND UBISHIT TOWER SIMULATORS! DON'T YOU WANT TO BE ABLE TO RUN ASSSCREED 1000 WITH MEGA GRAPHICS AT 16k
I've bought three games in the last 5 years, why would I give a fuck?
>please bro buy the 16GB bro you really need it bro trust me bro
>why?
>to play the latest AAA slop of course
I haven't played a single (non-indie) game that released in 2022 or 2023
>nVidia is healing.
not good enough. a xx50 used to be 200, a xx60 used to be 250-280, a xx70 used to be 320-360, a xx70Ti used to be 400-450, a xx80 used to be 550-650, a xx80Ti used to be 700-750. now those cards are priced relative to their power. so a card that delivers great performance at 1440p today has the same price as a card that delivered great performance at 1440p 2 gens ago. gone is the "oh just wait a year you're going to get something better for cheaper", now it's the same price for the same performance just one card tier lower. in a couple years we're going to see xx50 cards that deliver 4k performance that cost as much as a current gen 4k card.
msrp prices are only for the americans. for europe you'll get taxed like hell
2 weeks ago
Anonymous
Americans always forget to mention they get taxed as well but it depends on the state how much you pay in taxes. For some reason they don't include tax in the sale price, they only add it on top at the counter/checkout which is pretty scummy but hey they Americans love emulating gnomish practices such as mutilating infants and having mandatory tipping at restaurants.
>fake frames that magically don't work on previous gen tensor cores
>even with fake frames it's only 30% more on average
>without fake frames it's 10% on average
...and Nvidia thinks it's worth paying shills to convince people to buy this shit lmao.
Seriously are you for real? You have a better card than me (I'm on a 6600) and I make like $3k/month.
I think even third worldies make more nowadays.
2 weeks ago
Anonymous
Yeah, no joke unfortunately. Just got really unlucky in life and currently can't even get a better job. PC is literally the only thing I spend money on, even my clothes are super old, I need to buy some this year I guess.
2 weeks ago
Anonymous
No, we don't.
that's nearly a dream wage lol, only some seniors earn that much.
So it's compatible with the feature that introduces artifacting and input lag, and is only even remotely playable at high fps which the 4060 will not be getting. Amazing value indeed.
cope, i bought a nvidia gpu and it simply works better than amd and dlss simply looks best
>dlss looks better than native
the guy even says its not noticeable in motion and specifically only has criticism of frame generation (again hard to notice while gaming), looks perfect to me
2 weeks ago
Anonymous
>upscaling shit looks better than native
>input lag imperceivable
Get cancer garden gnome tool
2 weeks ago
Anonymous
Stop kvetching, goy.
2 weeks ago
Anonymous
what's the alternative? amd? get a grip, the 7900 xtx costs more than the 4080 and performs worse in most games while offering fewer features
2 weeks ago
Anonymous
Eat shit uncultured animal
2 weeks ago
Anonymous
>the 7900 xtx costs more than the 4080
In what shithole?
2 weeks ago
Anonymous
on newegg? ur a spastic if you think the 7900 xtx is cheaper than the 4080, while it also lacks dlss and codec shit that the 4080 has
2 weeks ago
Anonymous
>on newegg?
US? My condolences.
2 weeks ago
Anonymous
>ur a spastic if you think the 7900 xtx is cheaper than the 4080
meanwhile in reality:
https://www.newegg.com/p/pl?d=4080
https://www.newegg.com/p/pl?d=7900xtx
7900xtx is available under 1k
2 weeks ago
Anonymous
I am not some tech illiterate mutt that can be fooled by that shit propaganda.
There's no computing method or rendering magic that can make 1280*720 upscaled to 1920*1080 look better than plain old native 1920*1080.
2 weeks ago
Anonymous
Yeah but it can do 1440 > 4K better than native 4K because DLSS also has a fundamentally better AA process
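Since this argument keeps going in circles, here is the raw pixel arithmetic both sides are implicitly arguing about (a minimal illustration; it only counts rendered pixels and says nothing about how good the reconstruction looks):

```python
# Pixel counts behind the upscaling argument; pure arithmetic, no claim about image quality.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Fraction of the output frame that actually gets rendered before upscaling.
print(f"720p  -> 1080p: {pixels['720p'] / pixels['1080p']:.0%} of the pixels")  # ~44%
print(f"1440p -> 4K:    {pixels['1440p'] / pixels['4K']:.0%} of the pixels")    # ~44%
```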
2 weeks ago
Anonymous
If you want good AA, then use DLSS at 100% resolution. No upscaling shit.
2 weeks ago
Anonymous
You cannot tell the difference at 4K
Even straight bilinear or lanczos scaling directly from 1440p to a 4K pixel grid is good enough even on a big TV anyway, and DLSS helps just enough.
2 weeks ago
Anonymous
I remember playing forza on my 4k TV and not realizing it was 1440p upscaled for hours until I checked the settings menu.
2 weeks ago
Anonymous
Look fag you can lie to yourself and repeat that shit like a mantra all you want.
It isnt gonna become true.
2 weeks ago
Anonymous
>Complete denial
Suit yourself
2 weeks ago
Anonymous
You will never be a real 4K frame.
You are a 1440p resolution image stretched and mutilated with artifacts to be displayed on a 4K screen.
Real 4K native frames laugh at you behind your back.
2 weeks ago
Anonymous
Suit
Yourself
2 weeks ago
Anonymous
I had a 1080p 24 inch Benq monitor back in 2008.
My Ati radeon HD3650 256MB was barely able to output an image for browsing and mundane tasks on it.
I played dragon age origins at 720p upscaled to 1080p. And it looked like shit.
But then I got a HD4850 512MB and it was handling it all a bit better.
But man, it wasn't until the R9 280X 3GB that it all fit together nicely.
All games running at native 1080p with high settings and good frame rates.
One thing that was weird when the first 4K TVs started showing up and normies asked me for advice on what GPU and system to build for it, was that there was no real 4K card to suggest.
Before the 4090 I couldn't just say
>that's top dollar most expensive shit but you get hardware that can do 4K
2 weeks ago
Anonymous
I refuse to converse with people that believe rendering the game at a lower resolution then upscaling it looks better than native resolution. When consoles do this we foam at the mouth but somehow literal peasantry has now become embraced in PC gaming. Actually pathetic. The only reason people delude themselves so completely is because lazy ports can't run on their systems at anything above 50fps if they don't overly rely on lowering the resolution to save the experience. Sad!
2 weeks ago
Anonymous
Both DLSS2 and FSR2 offer the best anti aliasing we've seen in years.
2 weeks ago
Anonymous
Not even close. SSAA and MSAA are objectively clearer, you just don't use them because your system can't handle them. Which is the irony here: SSAA renders assets at a higher resolution then downscales them, making for immaculate picture quality and sharpness, while DLSS renders assets at a lower resolution then upscales them and applies post processing to blur out the flaws.
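To put concrete numbers on that distinction, here is a minimal sketch of the internal render resolutions involved; the 0.667 per-axis factor for an upscaler's quality mode is a commonly cited default and should be treated as an assumption, not a spec:

```python
# Internal render resolution for supersampling vs upscaling, at a 2560x1440 output.
out_w, out_h = 2560, 1440

def internal_res(per_axis_scale):
    return round(out_w * per_axis_scale), round(out_h * per_axis_scale)

print("4x SSAA :", internal_res(2.0))    # (5120, 2880): 4x the pixels, then downscaled
print("native  :", internal_res(1.0))    # (2560, 1440)
print("upscaler:", internal_res(0.667))  # (~1707, ~960): ~44% of the pixels, then upscaled
```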
Maybe it's my gsync monitor, maybe it's the nvidia low latency i got enabled, maybe it's my almost 40 year old eyes but i never noticed any of the stuff you're talking about. I get around 100-130fps in MW2 with DLSS at 1440p. Sure, the game looks objectively worse in screenshots with dlss enabled but as long as the picture stays in motion there is no difference.
So many people in this thread talking about pointless shit, how about you save some money and buy a decent card instead of like a 4060 or some nonsense. I'm on welfare and I have a 4080, 12900k, 32gb ram, etc. WTF is your excuse for a poorfag pc, who gives a fuck about vram
Doesn't actually matter because they added more L2 cache. It's the same reason why you can use shit ram on 5800X3D, the ram speed doesn't matter because things are mostly in cache.
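A toy model of that cache argument, for what it's worth; the hit rates below are invented for illustration, and whether the real hit rate stays this high at 1440p and above is exactly the open question:

```python
# If a fraction of memory requests is served from L2, DRAM only sees the misses,
# so the card behaves as if its bus were wider than it physically is.
def effective_bandwidth(dram_gb_s, l2_hit_rate):
    return dram_gb_s / (1.0 - l2_hit_rate)

dram_gb_s = 288.0  # e.g. 18 Gbps GDDR6 on a 128-bit bus
for hit_rate in (0.0, 0.3, 0.5):
    print(f"L2 hit rate {hit_rate:.0%}: ~{effective_bandwidth(dram_gb_s, hit_rate):.0f} GB/s effective")
```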
>nVidia is healing.
No it's not, they just got to a point where they weren't selling any cards so they had to lower the prices. Still, that GPU should cost 200€ like a 1060 cost, not 500€ like the 4060 costs; this is a literal scam and you're falling right into it.
Depends on whether you want to fuck with AI or are worried about the AAA VRAM meme. Do you know the history of the 3060ti or are you just getting one off of ebay?
Another option would be to wait for the 4060 to drop in a week or so. It's supposed to retail for 299 burger tokens so you could probably get one for the same price as a 3060 if not cheaper.
the "I need more than 8gb of vramfor 1080p" meme was spurred by nvidia and amd shills and pajeet programmers who can't code.
You know what's the ideal amount of VRAM for editing 4k movies? That's right: 8gb.
And do consider the fact that, in theory, games should not be as demanding as editing movies.
And let's not forget that games also use the CPU's ram, which is on average 16 GB.
No matter how high fidelity your game is, if it can't run well at 1080p with 8 GB of vram, it is simply unoptimized GARBAGE
End of story
You talk a lot about resolution but that's hardly the biggest culprit. On average, going from 1080p to 1440p only increases vram usage by like 1GB, if that. It's often the massive 4096x4096 (or even higher) textures that take up most of the space.
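Back-of-the-envelope numbers for that claim (rough figures; real games mix texture formats, but the orders of magnitude hold):

```python
# Approximate VRAM footprint of one 4096x4096 texture, including a full mip chain (~+33%).
def texture_mib(side, bytes_per_texel):
    return side * side * bytes_per_texel * 4 / 3 / 2**20

print(f"RGBA8 uncompressed: {texture_mib(4096, 4):.0f} MiB")    # ~85 MiB
print(f"BC7/BC3 compressed: {texture_mib(4096, 1):.0f} MiB")    # ~21 MiB
print(f"BC1 compressed:     {texture_mib(4096, 0.5):.0f} MiB")  # ~11 MiB
# A single character or environment set can use dozens of these, which is why the
# texture quality setting moves VRAM usage far more than render resolution does.
```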
NTA but you're exactly right, remember how uncompressed audio inflated video game file sizes? Same shit is happening with textures and it doesn't make them look better. It's placebo and it only serves to make the new graphics cards of whichever vendor has the sweetest deal with the developer look good.
Most modern AAA releases on PC are unoptimized, and older cards in the same class released with more VRAM or for far less money. Gimping vram on next gen cards isn't going to change how developers make games.
>Gimping vram on next gen cards isn't going to change how developers make games.
This. You can:
A: Complain about unoptimized garbage and don't play new triple As.
or
B: Buy a card with enough VRAM to play unoptimized garbage
xx60 cards don't need more than 8GB regardless of resolution, since the GPU core will run out of grunt long before the VRAM does.
Remember this thing is only like 10% faster than a 3060ti on a like-for-like basis. And nvidia is trying to garden gnome you into thinking you can use it for AI. kek what a shitshow.
No the 60 cards struggle because they aren't powerful enough to run modern unoptimised AAAslop on max settings. It has nothing to do with VRAM. It's a midrange card and people think you can max out graphics on it.
The 70 cards do need more than 8GB though, since those should be targeting 1440p resolution or higher.
the 3060ti is only around 15% slower than the 3070 you fucking moron
2 weeks ago
Anonymous
Check videos of a modded 3070 with 16GB or the A4000 (same chip, lower clocks). The card is not that powerful for maxing modern triple As.
2 weeks ago
Anonymous
What if the hardware of the cards is exponentially better except for VRAM, but the software of the video games is worse?
2 weeks ago
Anonymous
>but the software of the video games is worse?
This is absolutely the truth. The only thing we can do is buy a powerful card with as few bottlenecks as money allows. Either that or cry about it later when your card can't play new unoptimized games because of the lack of VRAM or low bandwidth (lol 128bit cards in 2023). In a perfect world everyone would have the money to buy a 4090 and be happy.
A card that could max out 1080p was a lot more impressive >6 years ago than it is in 2023.
2 weeks ago
Anonymous
Alright but if I only cared about 1080p would buying a x060 be good enough? The only catch that concerns me here is the vram difference that might act as a bottleneck considering how modern games are designed. Am I wrong?
2 weeks ago
Anonymous
It *should* be good enough. It's not that you're wrong, it's that these companies are wrong for thinking $400 for an entry level card is acceptable
>1030 >1010
These are for media playback on a setup without an igpu not for gaming
2 weeks ago
Anonymous
We're seeing senseless price increases across the tech industry. Be it smartphones or VR sets. The prices are only getting bigger and bigger. I share your same concerns but if I had a stable job I would probably bite the bullet since I am not the kind of person who upgrades parts often
2 weeks ago
Anonymous
Smartphone prices have been relatively stable and VR is a niche emergent market. I spent less than $1k total across multiple upgrades for a flagship phone over the past half decade or so.
You gays accuse every tech channel of being shills.
2 weeks ago
Anonymous
How do you think garden gnometube gays earn an income, retard?
2 weeks ago
Anonymous
Thank you for proving my point.
2 weeks ago
Anonymous
Making some gay video to show how well a game runs is not a job and takes no effort you retard.
They get paid in advertisements and sponsorships. Corps pay them to show their products on their channel for exposure.
2 weeks ago
Anonymous
Why because you say so? You retards accuse all youtubers of not having a job, no online video takes effort it's all fake blah blah blah. >but they have sponzerships!!!
I don't give a shit. You have no serious argument against their credibility, just the same tired accusations you use against every outlet and channel that covers game performance.
2 weeks ago
Anonymous
>Making some gay video game or making sure it runs well is not a job and takes no effort you retard.
2 weeks ago
Anonymous
>Making some gay video game or making sure it runs well is not a job and takes no effort you retard
Apparently it does since diversity hires can't program for shit and as it turns out don't have the skills to make a proper, functioning game.
2 weeks ago
Anonymous
Why would they when making a video game isn't a real job and takes no effort
2 weeks ago
Anonymous
Digital gayry is notorious for being shills.
2 weeks ago
Anonymous
Digital Foundry is notorious for triggering console fanboys by exposing their weak performance and getting accused of being shills.
Has this happened before in the gpu industry? The 3060 MSRP was 399 and has 12gb vram 192bit bus, I got a used one for $240. Yeah there's a decent performance improvement, at a higher cost, with other worse components and less ram
>650w
I have a 650w titanium Seasonic PSU. This should be good right? My current GPU is dying, it would randomly black screen when playing any 3D AAA slop but it's started doing that while I'm just shitposting and not even gaming.
Nvidia doesn't care about the consumer market anymore. They earned a shitton during the great cryptocucking and now they earn a shitton on big corpo AI shenanigans. They don't need us.
>Nvidia doesn't care about the consumer market anymore.
And AMD doesn't care either. They make so much more money selling their enterprise GPUs than consumer GPUs, it's not even comparable. The difference is huge.
We're all fucked. Only Intel can save us but they somehow have worse drivers than AMD.
Why do you think they are going on about cloud gaming so much, price gamers out of owning their own hardware and push them onto another subscription service
There isn't one yet. They have no idea how to fix their RDNA3 crap. It's full of bugs and high power draw. This is why they're taking so long to release more cards.
Incidentally, the only card in that entire list on the AMD side that was their top end offering that generation is the RX 5700 XT, since RDNA 1 never got a big GPU. Can't wait to hear about the "many generations (generations)" I missed out on which have also been scrubbed from the internet.
The 7000s were nice, but were the start of AMD really pumping dem volts into the high end to stay competitive pre-Pascal, a Sapphire 7970 is probably the worst card I've ever owned in terms of reliability. The button on the card let you switch between >really high stock overvolt
Or >VRM toastingly retarded overvolt with 8% OC
I had to bake that fucker back to life multiple times, which sucks, because that card was a fucking beast. And then Catalyst transitioned to Crimson, and after a bunch of really bad AMD replacements, my Fury X became a 1080 ti
We seriously no shit need HBM on future GPUs if we wanna solve the VRAM/high resolution crisis. Fury was a terrible showcase of HBM, but the HBM itself let it punch a lot higher than it should've, speed, bandwidth, and timings are insane, but 4 FUCKING GIGS ON THE CARD THAT WAS SUPPOSED TO SHOWCASE HBM. I couldn't run Mankind Divided at all on that thing without turning everything down to very noticeable medium-high compromise settings, and it also released during the move to Crimson, games just didn't fucking work, the control panel never applied settings. The 400s and 500s ran circles around it just because they had 8 and 12GB, 4GB sealed its fate, and Nvidia hoards it for data center and AI bullshit
Was considering getting the 4060 ti 8gb instead of the 4070 because of how much cheaper it will be but the performance gap will probably be way bigger than 3060 ti vs 3070
I don't really play new AAA games that need a lot of vram but would like to max out some older titles at 1440p while getting over 100 fps
Sorry but not really, honestly.
I don't see why you'd pay so much more for the same amount of ram as the base 4070.
If you were gonna spend close to 1000 bucks on a card you should just go for the 4080 honestly.
>Useless RTX crap
not paying for that. give me actual GPU power, why would I waste my money on gimmicks I won't use most of the time? RTX should be an optional thing and the RTX chip should be a general thing that provides power for all games even without RTX.
AMD cards at least have more power for textures and shit, instead of wasting it on raytracing.
But adding 6 gb of ram is not expensive. VRAM is one of the most visible specs that plebs pay attention to when buying cards; they will absolutely buy one card over another, or not buy it at all, because of a few gigs of vram.
If old cards have more VRAM they won't buy this one. this deal is not better than a used 3060 12 GB. if your new card is not better than buying old used shit then it's not a good product
The only other configuration possible for the 4060 would be 16GB (on a 128-bit bus it's either four 2GB chips for 8GB or eight in clamshell for 16GB), and as you can see nV is asking a steep price for the additional 8GB of VRAM.
The 4060 is a 1080p card at best; low end gamers don't have high standards but they care about the price more.
2 weeks ago
Anonymous
what do you mean low end, this is not a low end card, it runs the games. $400 may be low end by today's crazy prices but for people $400 is still a lot because they don't live in a GPU world with new gpu prices, they live in the real world where they pay their bills. people who are not willing to spend a lot more money but are tech savvy enough to build their PC and buy a single GPU are actually quite picky and try to go for the best gpu power per dollar.
2 weeks ago
Anonymous
The 4060 is a low end card; the 4050, whenever it launches, will be the entry level GPU for this generation. just because we don't get many $250 GPUs anymore doesn't change that.
Buying low end hardware is stupid even if you are poor because those GPUs age poorly.
If you can't afford $400 every 3-4 years you need to stop playing games and invest in yourself.
Gaming as a hobby is very cheap it's the main reason why it's so popular.
I'm lucky to be youropoor, I can just move to another country whenever I like.
2 weeks ago
Anonymous
>Buying low end hardware is stupid even if you are poor because those GPUs age poorly
Tell that to my GTX950 that still plays everything I want to play.
2 weeks ago
Anonymous
>everything I want to play.
This is a coping mechanism, I know it well. Once you taste freedom you won't be able to go back.
I used to raid in wow at sub-10FPS; I would consider that unplayable now.
2 weeks ago
Anonymous
You actually are a paid shill, aren't you. Nobody can be this stupid. Go compare Vanilla WoW spec requirements to Classic. It's a fucking joke.
2 weeks ago
Anonymous
What are you even talking about?
I'm just saying that when I had shit hardware and I was younger I could accept a shit gaming experience, and now I would not enjoy that even a bit.
2 weeks ago
Anonymous
but we do, rx 6600 is a 3060 equivalent and costs 240 bucks new + a free game (right now it is sold with the last of us 1), in NEW condition of course, in France (so not a cheap country), probably even less in the US
2 weeks ago
Anonymous
nGreedia has near monopoly, many gays don't even know AyyMD sells GPUs.
Gaming laptops are also very popular among normalfags and those very rarely come with Radeon as well.
But yea 6600 is good enough if you enjoy sub-cucksole experience.
2 weeks ago
Anonymous
>Buying low end hardware is stupid even if you are poor because those GPUs age poorly.
Once upon a time there were sweet spot GPUs in every gen, usually around mid tier.
Like the reasonably priced option that performs ok.
It was the 1060 6GB, RX 570 4GB, 750ti, R9 270X, HD5750 (same chip as the 5770), 8800gt 512MB..
2 weeks ago
Anonymous
Yes that was the case. You can still play 95%+ of the games on older hardware. I would remove 750Ti from your list it was trash from the beginning.
It's just games are made to run on cucksoles first, plus any corporation just cares about profit. So if customers can keep their hardware for long they miss out on sales.
If you are fine with 1080p 60FPS you don't need to upgrade often.
I'm just elitist prick, that's all.
2 weeks ago
Anonymous
750ti was real cheap used and performed well over what people paid.
Like if you could buy 1050ti at $50 or 3050 at $100
2 weeks ago
Anonymous
It came so late it was a super slow card, the 9xx series was launching a few months later. I know nV skipped 8xx cards on desktop.
It was poor value, used market is separate thing because it varies drastically from country to country. Even now retards are trying to sell 750Ti's close to the price of rx570.
2 weeks ago
Anonymous
Theres millions upon millions of them for a reason.
2 weeks ago
Anonymous
it was literally a launch card for the Maxwell series
its best strength was requiring no auxiliary GPU power and enabling normies to slot it into any kind of existing or second hand Walmart tier PC and get a comparable experience to consoles
Every 40 series card except maybe the 4090 is selling like shit and nvidia is starting to feel the pinch as shown by the very slight price drops and slowing of production. Everybody just has to keep on being patient.
bought a 6800xt and waiting for the 5000 lineup. this generation is a wash, just save up $1000-$1200 for next gen. in a year or two get 2x32GB DDR5, a 14th gen intel CPU+mobo, a 4th gen NVME and you're good until 2029.
I've been using a nvidia 1660 TI for ages and it's really showing its age at this point. Are there any graphics cards that are decently priced that I should aim for? I do like to play modern releases.
I think it's enough if you don't plan on playing the latest AAA games at max settings for the next 5 years because of the vram meme. If you're targeting 60 fps then yes, it's definitely good enough maybe even slightly overkill but you'll be future proof.
I'm personally considering 4070 for 1440p 144hz on older games so vram isn't much of a concern. I mean it will definitely run new future games but without ray tracing or some extreme settings.
My bigger concern is how to go about my old A320M-A motherboard and cpu. I refuse to buy new motherboard chipsets and DDR5 memory.
>developers are lazy and openly admit how they operate under the philosophy of just brute forcing their badly made games through
>they blame the hardware when they can't brute force
Of course they're both fucking the customer. There's no reason for Nvidia to garden gnome out on VRAM because it's cheap as dirt for them. Developers need to LEARN TO CODE and stop using Tim's hekkin' chonker shit Unreal Engine for everything. Also stop adding polygons to models, you've added enough.
Nvidia shills are truly retarded subhumans. It's even more funny when they ignore the fact that there are GPUs out there right now with 16GB for less than $400 like Intel Arc A770. If you aim for budget and still pay more for less you are by definition a retarded subhuman.
I have a 6gb 2060
Would it be worth it to get a 4060? The fact that it draws less power than my 2060 while being almost double the performance seems interesting
The only big game I play on my 2060 is modded Skyrim and I have roughly 45 fps which isn't bad, other than that I mostly play older games from the 2010s
However the noise on my 2060 is bothering me when it's under load, so I had to put the fan to 45% under load and undervolted it resulting in a roughly 5-10% power loss but still...
I would like something kinda future proof that will work for the next 5 years
Might just wait for the rtx 5 series at this point, my 2060 is still going strong...
>$400 for a fucking low tier GPU
are people fucking insane? I just paid like 550 for a 6800XT with 16GB. All for this RTX bullshit? Eat my ass, there's pretty much no games worth playing that take advantage of this unless you're a AAA goyslop consoomer.
I bet the 4060 regular version will barely outperform the 3060TI while being more expensive.
In fairness, there's a good reason why Nvidia can get away with charging this shit, since their GPUs are also now used for AI.
Plus, there's the fact that they know they have consumers by the balls in terms of overall power efficiency, brand reputation, better drivers than AyyMD, and software. DLSS as an upscaling solution is so much better than FSR it's not funny.
Is it true that I need a good CPU if I'm aiming for high fps, and a good GPU for high resolutions?
So that means I should just get a 4070 but pair an i9 with it?
How so? 970 was a card that could barely run Far Cry 3 on ultra in 1080p 60 fps. Your standards have just changed. You want everything to be so much more. Touch grass.
Learn about basics of computing, learn about how programs are compiled how they are executed, then learn a bit about CPUs and memory, then find out what you really like doing with a computer.
Like if you really enjoy some game, find out in what engine it was made, what kind of system requirements it has, find it running on shittiest lowest setting, find it running at the best possible imaginable settings and hardware.
Watch a few videos about monitors and how they work and about their resolutions and refresh rates.
So after your understanding of everything broadens and deepens you can understand each component better.
I was lucky my favorite game was Quake 3, and I could even look into the source code and change random texture files and sound files and do whatever I wanted with it.
Building some poverty gaming computer can also help you heaps.
Like buy some shitty used office computer with some old i5 like the i5 4570, then add some old gpu into it, change the PSU, add more ram, add an ssd, install an os.
Like waste $250 to $300 tops to give yourself a learning experience that can only come from real world practice that you do yourself.
Can you even tell the difference? I got 3070 Ti which supposedly has very fast VRAM speeds but I can still easily tell when it's loading in the textures for the first time.
It's because we've reached the limits of what they can do with current designs and nobody has any idea of what to do next. At least AMD has chiplets, vcache and infinity fabric to play with. It's almost time to do another unified pipeline and mix shaders with AI.
>morons want the top tier GPUs at 200 dollars to play their goyslop AAA kusoge >in this inflation
you'll never EVER get entry level GPUs for cheap ever again
it's fucking OVER
I'm not gleefully going along with anything, I'm a 1060 at 1080p chad and I'm perfectly fine with this turn of events >things are more expensive because they are
yes
you think you're being smart with that clever "haha gotcha" but that is literally how things work
Diminishing returns with SLI, plus the requirement of rendering frames in real time while needing the data from the last frame to render the next one.
SLI and crossfire never really took off.
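A crude way to see why frame-to-frame dependencies killed alternate frame rendering; this is a toy model with a made-up split between dependent and independent work, not a measurement:

```python
# With AFR, only the part of a frame that does NOT depend on the previous frame
# can overlap on the second GPU; the dependent part (TAA history etc.) serializes.
def afr_fps(gpus, frame_ms, dependent_fraction):
    serial_ms = frame_ms * dependent_fraction
    parallel_ms = frame_ms * (1 - dependent_fraction) / gpus
    return 1000 / (serial_ms + parallel_ms)

print(f"{afr_fps(1, 16.7, 0.0):.0f} fps")  # ~60 fps on one GPU
print(f"{afr_fps(2, 16.7, 0.0):.0f} fps")  # ~120 fps if frames were fully independent
print(f"{afr_fps(2, 16.7, 0.5):.0f} fps")  # ~80 fps once half of each frame needs the last one
```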
4060 Ti is the same MSRP as the 3060 Ti and has a decent performance bump, or a massive one if you are ok with DLSS3. Keep in mind the 3060 Ti was released before Bidenflation really kicked into gear. Nvidia is fighting tooth and nail to keep GPU prices low and it seems only I'm smart enough to see it.
there is nothing wrong with these GPUs as long as they're priced appropriately
Imagine it was
$199 4050 Ti 8GB
$249 4060 8GB
$299 4060 16GB
instead of
$299 "4060" 8GB
$399 "4060 Ti" 8GB
$499 "4060 Ti" 16GB
Would anyone actually complain?
SLI was incredibly janky shit to get to work, and even then it didn't work a lot of the time from what I recall. There's a reason gamers gave up on it.
>memory bus as narrow as Nvidia CEO's micropenis
>the cheaper model has the same amount of VRAM as my GPU from fucking 2014, almost literally ten years later
>the model with somewhat acceptable VRAM for 2023 is the price of a midrange card while touting itself as an entry "budget" model
>the cheaper model has the same amount of VRAM as my GPU from fucking 2014, almost literally ten years later
Same amount, different type. GDDR6 is much faster than GDDR5 or GDDR5X. Not disagreeing that the current generation of Nvidia cards is a blatant scam (other than the 4090, which is fantastic but massively overpriced), but 8GB of VRAM from 2014 is a lot slower than 8GB of VRAM from 2023.
not the other guy but
if he's talking about an 8GB 2014 GPU, I'm assuming he's referring to the r9 290X 8GB, which had a bandwidth of 320 GB/s
4060 Ti has a bandwidth of 288 GB/s, 4060 272 GB/s
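For anyone wondering where those figures come from, it is just per-pin data rate times bus width; the data rates below are the ones implied by the quoted bandwidths and bus widths:

```python
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def bandwidth(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth(5.0, 512))   # 320.0 GB/s: R9 290X, 5 Gbps GDDR5 on a 512-bit bus
print(bandwidth(18.0, 128))  # 288.0 GB/s: 4060 Ti, 18 Gbps GDDR6 on a 128-bit bus
print(bandwidth(17.0, 128))  # 272.0 GB/s: 4060, 17 Gbps GDDR6 on a 128-bit bus
# Much faster memory per pin, but a quarter of the bus width, nets out to less total bandwidth.
```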
Nvidia wants you to downgrade to their dlss 3 blurry interpolated crap to use less memory so the gpus can mine secretly in the background on special hidden chips in the management engine
Can relate, tried AMD but something just felt off, a general degradation of image quality even in 2d despite using the same monitor and cables.
It's really true, Nvidia is unbeatable in all aspects.
Rtx 5090 will be the same speed as the 4090 but twice as expensive, though it has dlss4 which gives you a 4x boost in fps and res over dlss 2 but only up to a 2x boost over dlss 3
I don't even know why normies bother upgrading anymore anyway but let's pontificate on the card.
Will be even better at ai, rt etc but barely better in raster as they will force new path trace shit to gimp their older gpus now that everything is good enough in raster even without upscaling interpolation shit
It's slower than my 2080ti at stock, which is slow as balls and costs almost as much (barely anyone bought one new so they got dumped on the used market cheap as shit either side of 2019-2022 for peanuts, never seen them go above 300usd)
NVIDIA locking dlss3+ to newer cards just to gimp the older ones is disgusting. Intel and amd will obviously follow suit with fsr and xess
If the 4070 TI can match the performance of a 3090, then the 4060 TI should be equivalent to a 3080, not a 3070. It even costs the same as a 3070 at original retail price. And why, WHY didn't the 4070 series have 16GB of VRAM?
The fuck does the memory speed even realistically matter. If textures are loading in then you will see it regardless of the speed. This is like those morons comparing 800:1000 contrast to 700:1000. You can't tell the difference anyway.
i got a 4070 on release and am happy
I would if it wasn't the same price as the 3080.
I got a 4090 and the room that's in is a furnace lel
>8gb
>1080p graphic card
>8gb
Yeah all correct
graphic card
>>$399
you need 12gb for ps5/xsx ports
If you want to play them in 1440p you fag.
nah, 1080p maxed out 8gb isn't enough
who plays with any less than 1440p in 2023? Other than pajeets and poorfag 3rd worlder
There's fuckall visual difference between 1080p and 1440p. At least upgrade to 4K if full HD isn't good enough for you
outed yourself as a modern "AAA" slopdrinker
1080p has been fine for the actually GOOD PC games made anytime in the last 15 years and no current game even approaches the merit of benefitting from a higher spec
instead these profligates spend $3,999 on hardware just to run snoyslop remaster remake port of jedi shitshow at buttery-smooth 24fps in glorious 1440p when you could just replay one of the hundreds of worthwhile games from the last 10 years (or good retro emulators that hardly even need any graphics card at all)
If you only play the so called good games then 1440p is not too hard to run. When I got 144hz I went through so many older games and they all felt so good.
Source?
>consoles move closer to being outright PCs compared to the last generation
>pc port optimization trends further downward
That's because optimization is the last step during development, and so execs just straight up remove that step nowadays to meet the deadlines. Plus, a lot of coding is now done by cheap third world labor and dumb kids straight out of college, so those kinds of people couldn't optimize for shit even if they tried.
Pssst, gfx manufacturers and triple-A games have a deal.
This. The only thing that can save gaming is economic collapse, or indie dev stuff exploding due to ChatGPT and other AI tools creating an exponential jump in the work that teams of fewer than 10 people can put out.
they make more money by convincing people they need a console for the optimal experience. People who build their own pcs are an incredibly small minority. If franchises like CoD sell millions every year, apply that IQ level to every avenue of media and you will understand how braindead consumers really are: they actually believe a console is worth the money compared to a midgrade custom pc that can emulate any game in history at better performance and resolution, yet putting in new hardware that literally clicks in place like legos is still too difficult for them to comprehend.
This is true to an extent, some indie games and messier console ports will drag you down pretty bad even with 12 GBs. Everyone knows Jedi Survivor is a mess, and in my experience Hell Let Loose will drop to single frames running at 3/4s of max settings. But from friends’ experience and from what I’ve read 8 GBs is good for 80% + of gaming
>8 GBs is good for 80% + of gaming
this is true for the next couple of years.
i wouldn't ""upgrade"" to another 8gb card but if you've got one you're fine. if you needed a new card and are short on cash, 8gb is fine but don't get the 4060 because it's gimped.
it's easy to cope when my 1070ti ate elden ring for breakfast. i'll shell out some big bux in a few years and get a high end 50xx or 60xx.
if my card died right now i'd get a 3060ti
no, i'd rather have a trash card and continue to game on pc
>1080p
That thing is going to bottleneck like a motherfucker at 1080p.
How strange, we had cards advertised as "1080p" for well over a decade. Are you saying that 1080p in 2023 is still somehow $399?
I played Crysis at 1080p on my GTS250 about 15 years ago. Ignore this thread and the retards in it.
You're not maxing out Crysis at 1080p on a GTS250
Can't do 1080p with an 8gb card anymore.
Why not just buy a 3060
>MUH 8GB IS BAD!!!!!!!!!
Hardware Unboxed is such a fucking gay for starting this disinfo
wow look at that turns out you can run "next gen" games with 8gb vram when you actually put effort into working a release
cope
modern games frequently eat up to 16gb or more of ram, and even if they didn't the more room you have to breath overhead the better for performance it is, also posting in an advertising thread
yeah if they're coded by pajeet monkeys I agree with you
I ran RE 4 Remake at 4k 60 on my 3070ti btw
>turns out you can run "next gen" games with 8gb vram when you actually put effort into working a release
you can't even emulate ps3 games at 1080p with 8gb vram
>rpcs3
>gpu dependent
thanks for proving you're a retard
resolution scaling is vram dependent
>you can't even emulate ps3 games at 1080p with 8gb vram
Source?
It was revealed to him by nvidia
I play PS3 games at 4K with a 1060, stop the bullshit
>emulation uses RAM
Really? First I've heard of this.
it's only going to get worse, dummy
i will stick to my 3060 12gb, 1080p 60fps, thank you very much
>d4 beta
>people were reporting memory leaks with gpus with 12 gb on ultra textures and in rare cases gpus were getting bricked
>start sweating profusely
>try the beta anyway despite the risk
>all settings on max except blur and other eye cancer inducing shit
>a sudden relief came onto me
>I've been an amd chad with 16gb and on linux and couldn't be affected anyway because I didn't fall for leatherjacket man's lies
feels good man
>shill coping sounds
and if you want $15 more worth of ram that'll be $100. apple tier pricing lmao.
>$100 for $15 of ram
That’s not all you’re paying for.
You’re paying for the GeForce downgrade program(tm) where they downgrade textures and introduce pop-in.
Already happened on the updates that “fixed performance” in hogwarts legacy and TLoU.
So in a round-about way, even by buying the card with an adequate amount of memory, you’re still paying to maintain the status quo of VRAMlet performance.
KWAB.
Examples like these are just bugs. Something that modern games now release with. Textures that use 7 gigabytes don't look like that.
They aren’t bugs they’re literally updates made months after the game is out.
You’ll notice the VRAMlet card (on the left) gets “higher fps” but the experience is shit.
That’s a feature not a bug, meant to fool idiots who only look at fps numbers on a spread sheet to decide what’s good.
Well maybe you are not pretending. Maybe this really is your first year of playing video games. And maybe you really don't know how good textures can look on even just 1 gb.
Sure if you don't mind buying a new video card every 2 years.
>People will defend anything that Nvidia does
For 330 bucks you could have bought an 8gb card in 2015 (R9 390). 4060ti is more than that and we are still stuck with 8gb even 8 years later (which is more like 80 in tech world)
IM ON 4!
Remind me what are some absolute GOAT masterpieces I can't play with 8GB of VRAM. I'll wait.
anon, the situation will get worse, not better
sooner rather than later, some random indie games will require 12gb vram
>but in a perfect world devs would optimize their games!
cool
at some point people will just get a console because nvidya is going retarded with gpu prices
To be honest i don't need more than 8gb and for that price it's fair enough for me.
>*RTX 4050
>128bit
Won't be usable for AI despite the 16gb ram?
But muh L2 cache.
how much bus and ram is needed for Aigger?
As much as you can solder onto a pcb.
>128 bit
lmao
the absolute fucking audacity of those fuckers
For reference, the x070's, aka
>fine, don't buy the 4080 OR 4070 you poorfag retard, fuck you
>Want to upgrade since my 1080 doesn't have enough VRAM for AI
>4060 still only has 8gb
What a fucking scam.
It’s a —60 class card. You’re dumb for expecting more, and plus there’s a 16GB variant for $100 more.
>It’s a —60 class card. You’re dumb for expecting more,
kys shill
>and plus there’s a 16GB variant
>for $100 more
exactly
3060 had 12GB you fucking moron
There is no excuse for this
No excuse for what? There's literally always been two versions of xx60 cards with different VRAM
1060 had 3GB and 6GB
3060 had 8GB and 12GB
4060 has 8GB and 16GB
Like are you retarded or what? This is no different
>literally always
>doesn't mention 2060
I was too lazy to look up the 20 series because nobody owned one
But the 2060 also came in 6GB or 12GB
4060 is 8 GB only
now compare the price
You're dumb to expect anyone to fall for such a lazy line and your employer is dumb for expecting 8GB to sell for anything over $300 in fucking 2023.
the 4060 ti sounds weak for a $500 card in 2023 based on its specifications
the quantity of vram it has is nice but it seems like it would shine in AI as opposed to vidya
What retard pajeets coded the launch version holy shit
"iron galaxy"
>driving force behind the rhetoric that you need 128gb of vram to play games are just games that are outright broken upon release
Due to those shitty ports, people believe a 16gb minimum will be necessary to play
in fact shitty ports are intentional to make them believe that
not just bad ports but also drama queens on youtube that push this narrative because it gives them clicks.. or they just want to shill amd.. which I would have nothing against if amd was not just as scummy as nvidia. The good guy vs bad guy framing in the market is always completely silly.
Killzone 2 would be impossible without at least 8GB of VRAM and 16GB of RAM these days. Why? The developers. Why would you subsidize incompetent development by demanding the ability to purchase a video graphics adapter with at least 16GB VRAM? AMD shills do not understand this.
I remember it was giga impossible to play FPS on console, so I just cheesed with the spy class, camouflaging and shanking people from across the room with the insta kill lock on. The assmad from carrying those games was unbelievable
>AMD shills do not understand this
You are just outing yourself as an Nvidia fanboy. "Hurr Nvidia is right they should give us less VRAM!". have a nice day.
I have an AMD 5700xt at the moment. I am not subsidizing developer laziness or incompetency with my own money and neither should you.
That's why I'm still on 8GB of RAM also. I am not subsidizing developer laziness by buying another stick of RAM. Fuck the RAM industry garden gnomes.
you're really sticking it to the man by having a suboptimal pc experience
>a suboptimal pc experience
What sort of dumbfuck shilling is this?
Is that actually your setup?
Didn't killzone 2 run at 640x480 or some ridiculous resolution like that?
It ran at 720p but they advertised that it would release at 1080p. Snoys also expected 60fps and it would kill crysis as the king of graphics with the almighty CELL
In rpcs3 it has the option to select 1080p, not scaling, just default. Dunno if it is just the emulator or the original game itself.
Killzone 2 supports 720p and 960x1080.
See:
>Killzone 2 would be impossible without at least 8GB of VRAM and 16 RAM these days. Why?
Because Killzone 2 wouldn't target 720p with low res textures.
It's funny you're using Killzone as an example. A shitty looking game with tons of particle effects thrown in to hide the garbage textures and low res rendering resolution.
vram inflation is good because itll be cheaper to proompt based curries
>What retard pajeets coded the launch version holy shit
iron galaxy, the company that brought you such hits as arkham knight which took a year to fix.
Iron Galaxy, the company so fucking bad that it's now become notorious for shitty ports. But someone, somewhere, keeps managing to get them business. How, I can't tell you. Even if they were cheap surely the followup costs to fix the fuck ups for months would negate that?
It's fucked up that they used to do decent stuff
>How, I can't tell you.
It's probably a case of their services being cheap and companies not bothering to get a good porting company if they can pay as little as possible
Sony own nixxes, it's just corporate retardation.
They lose more money from a bad port than they could save.
Textures looking like they traveled back to 2007, no this is worse.
OG crysis and the STALKER games have better textures than that pos, and they only needed 512MB - 2GB cards.
>Give the low end model extra vram
Why?
so rajesh can afford it for ai slop
>AI
A used 3090 is around 750$
The AI coomers need 16gb
Only 6 actually
Fucking exaggerated
you want bus bandwidth for ur vram moran, a 128 bit bus is going to fuck ur workload
>1080p card for $400
woah what a deal
>10% faster chip than the 3060
>at 20% higher price
>with 25% less memory
a 4050 for 400usd
>8gb
It takes manual extra work to make 8gb cards work. What happens when next gen nvidia does up the vram amount? You can kiss that extra dev effort goodbye. An 8gb card is a ticking time bomb that will be broken on release in many titles and may not ever get working if the devs don't care enough. Having less vram than the consoles that games are made for is a bad fucking idea no matter how hard shills try to lie to you about it.
except this narrative is false and again only shows that people do not have opinions of their own and just follow the herd or whatever their favorite youtuber said instead of doing their own research about the problem.. consoles do not have a dedicated 16gb of vram just for rendering; the system takes 4gb of it and some of the vram also has to serve as regular ram.
The textures are made for 11 to 12GB vram. Then they just put them into some automatic tool to resize them for mid and low settings. Like it can be some shitty photoshop action macro you can make in 30 minutes and leave it to run on thousands of files.
The results are shit you can see in all of those games that are benchmarking high vram usage with high and ultra textures.
Also shared with ram... you think modern games only use what 2gb ram?.. jesus fuck
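For the curious, the resize "macro" described a couple of posts up really is about this much work. A minimal sketch assuming loose PNG sources and the Pillow library; real pipelines use dedicated texture tools and block compression, and the folder names here are made up:

```python
from pathlib import Path
from PIL import Image

SRC = Path("textures/ultra")   # hypothetical folder of full-res source textures
DST = Path("textures/medium")  # hypothetical output folder for the "medium" set
DST.mkdir(parents=True, exist_ok=True)

for src in SRC.glob("*.png"):
    img = Image.open(src)
    # half the resolution per axis, i.e. roughly a quarter of the memory footprint
    half = img.resize((max(1, img.width // 2), max(1, img.height // 2)),
                      Image.Resampling.LANCZOS)
    half.save(DST / src.name)
```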
They literally are clueless morons adding prefab assets into engine dev kits.
There are no work hours paid to any of them to do any sort of optimization.
Console OS reserves around 4GB of memory for background processes and what's left of the shared 16GB is what they aim at and use.
And its like you are one of the moronic shills that knows exactly what I am talking about but is intentionally playing to be a retard.
Devs do not even try to use system RAM more or to optimize the flow of how and which assets are loaded, so you can have 64GB of the fastest DDR5 and it's worthless if your shitty-ass game runs like shit on a GPU with just 8GB of vram.
They do this intentionally. Thats how the top down corporate orders are from the garden gnome shareholders.
moron gays garden gnome tool.
Oh, then it's a good thing the 4060 has 12gbs of vra...
Dude, you bought an 8gb card because you're poor or just made a stupid mistake, but stop trying to throw people into the same hole you fell into as a coping mechanism.
Anyone reading this thread, DO NOT BUY an 8gb card, you will regret it.
I did actually buy 4070ti.. but the whole narrative is stupid.
I bought an 8GB RX 480 in 2017, then an 8GB RTX 3070 in 2020.
They shouldn't be releasing 8GB mid range cards in 2023, it's a fucking disgrace.
You're a fucking retarded moron.
>nVidia is healing
not until they cut out the ai cancer and make gpus again
What unoptimized AAA cancer are you retards playing where 8GB isn't enough to game at high settings at 1440p?
any game released in 2023
werks on my machine
I played RE4R at all high settings on 1440p with an average of 90FPS with my 8GB card lmao
Probably TLOU on release
Hogwarts
TLOU
Jedi Survivor
Forflopen
Callisto Protocol
RE4 with RT
> inb4 just turn it off
Plague Tale Requiem
Hogwarts runs fine on my rig. Too bad it's just another cookie cutter ubishit game.
The rest is all trash as well so I haven't played any of it.
>Hogwarts runs fine on my rig
It was heavily patched to "better" stream textures. They still take a very long time to load and until they do they look worse than a PS1 title. This is why it runs so well now on low VRAM. Before the patches the game was stuttering like mad because it loaded too many high resolution textures at once.
But I pirated the 1.0 version. Having no trouble.
So it's an optimization issue, not a "low" vram issue.
Denuvo causes a ton of problems too.
>So it's an optimization issue, not a "low" vram issue.
Partially, yes. But there's only so much a game can do to fit all its assets in VRAM. Texture streaming on low VRAM will always look bad.
>Texture streaming on low VRAM will always look bad
So just load in textures beforehand? Everyone has at least an SSD these days so there's no excuse for devs to not be able to implement this.
>So just load in textures beforehand?
No space. Will lead to stuttering because of VRAM running out. Texture streaming is a necessity.
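A dumbed-down sketch of what a streaming budget boils down to; the numbers and the LRU eviction policy are purely illustrative, not any engine's actual code:

# toy sketch of a VRAM texture budget with least-recently-used eviction
from collections import OrderedDict

VRAM_BUDGET_MB = 6000          # assume ~6 GB left for textures on an 8 GB card
resident = OrderedDict()       # texture name -> size in MB, oldest first

def request_texture(name, size_mb):
    if name in resident:
        resident.move_to_end(name)                 # recently used, keep it hot
        return
    while resident and sum(resident.values()) + size_mb > VRAM_BUDGET_MB:
        evicted, _ = resident.popitem(last=False)  # drop the least recently used
        # the evicted texture falls back to a low-res mip until it gets
        # re-streamed, which is the blurry pop-in people complain about
    resident[name] = size_mb

The smaller the budget, the more often that while loop fires, so on an 8GB card you either accept pop-in or the game stutters while it shuffles assets.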
God damn but do you gays have horrible taste in vidya.
ok, you have great taste in vidya.
Now leave.
No.
Re4 runs fine in 4k on a 2070s
high res vr games with h264 encoding for oculus link
128
Fuck off nog
kinda unrelated but if you like generating AI shit with stable diffusion you'll want to max your vram, and as someone who thought he'd be on a gtx1080 (8gb) for a decade I'm now seriously looking at upgrading because it's so fun to generate pictures and the 1080 is at a bad spot where the next gen literally takes half as much time per iteration
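If you're trying to squeeze stable diffusion onto 8GB in the meantime, fp16 plus attention slicing usually does it. Minimal sketch with diffusers; the model id is just the common SD 1.5 example, swap in whatever you actually use:

# minimal sketch: runs SD 1.5 comfortably on ~8 GB cards
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.enable_attention_slicing()  # trades a bit of speed for much lower VRAM use

image = pipe("a graphics card on fire, oil painting").images[0]
image.save("out.png")

It'll still be slow on Pascal because there are no tensor cores, which is the real reason the newer cards halve the iteration time.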
Don't need it to play Deus Ex.
>an entry level card is now $400 instead of $200
lmao
?
Entry level cards are still under $200.
The 4060 is not an entry level card.
>The 4060 is not an entry level card.
1080p is entry level
4060 is console tier three years later
The console equivalent card is something like a 2060 super
consoles are 6700(xt)
4060 is a scam
They are not.. in fact they are slightly slower than the rtx 2060 super
>meme tracing
cope harder. https://www.youtube.com/watch?v=I-ORt8313Og&t=
It is just meme i swear!
never change.
Actually it is more close to the 2060 with no super and RX 6600 XT
>Actually it is more close to the 2060 with no super and RX 6600 XT
and yet it will give 2x the performance of a PC with a 2060. Even a 4090 struggles to run the latest ps5 ports smoothly while the ps5 has no issues. PC gaming is a meme.
PS5 runs them below 1080p and below 30 fps.
>consoles are 6700(xt)
lolno?
Their real world performance is closer to that of a 5500xt. Their games also only run on low-mid settings with upscaling out the ass.
please direct me to an entry level card
last i checked it was supposed to be the rx460 but they didn't actually exist
and don't even get me started on the gt 1030
GTX 1060 6GB would be $377.93 today
1060 was $250
>only aaaa game that might be worth playing on the horizon is starfield
>no reason to upgrade for it before significant mods are out in a few years
You are genuinely retarded if you buy a 40 series card.
>you need more than 8 heckin vee ramerinos
>what about all the frickin' NEW SONY PORTED GAMES AND UBISHIT TOWER SIMULATORS! DON'T YOU WANT TO BE ABLE TO RUN ASSSCREED 1000 WITH MEGA GRAPHICS AT 16k
I've bought three games in the last 5 years, why would I give a fuck?
Never a better time to buy a console and wait a few years to buy a used gaming pc that plays patched and actually working games.
>please bro buy the 16GB bro you really need it bro trust me bro
>why?
>to play the latest AAA slop of course
I haven't played a single (non-indie) game that released in 2022 or 2023
>nVidia is healing.
not good enough. a xx50 used to be 200, a xx60 used to be 250-280, a xx70 used to be 320-360, a xx70Ti used to be 400-450, a xx80 used to be 550-650, a xx80Ti used to be 700-750. now those cards are priced relative to their power. so a card that delivers great performance at 1440p today has the same price as a card that delivered great performance at 1440p 2 gens ago. gone is the "oh just wait a year you're going to get something better for cheaper", now it's the same price for the same performance just one card tier lower. in a couple years we're going to see xx50 cards that deliver 4k performance that cost as much as a current gen 4k card.
>a xx60 used to be 250-280
When? My 3060 cost me 350€ 6 years ago
>3060
>6 years ago
welcome, time traveler
Meant 1060
msrp of the 1060 was 250
Doesnt mean that thats what stores sell it for
>I overpaid like a retard
>what are taxes
msrp prices are only for the americans. for europe you'll get taxed like hell
Americans always forget to mention they get taxed as well but it depends on the state how much you pay in taxes. For some reason they don't include tax in the sale price, they only add it on top at the counter/checkout which is pretty scummy but hey they Americans love emulating gnomish practices such as mutilating infants and having mandatory tipping at restaurants.
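Back-of-envelope for how a $399 MSRP turns into an EU shelf price; the exchange rate and VAT figure are assumptions and vary by country:

# rough MSRP -> EU shelf price estimate; rate and VAT are assumed examples
msrp_usd = 399          # 4060 Ti 8GB MSRP
usd_to_eur = 0.92       # assumed exchange rate
vat = 0.21              # assumed ~21% VAT, differs per country

pre_tax_eur = msrp_usd * usd_to_eur
shelf_eur = pre_tax_eur * (1 + vat)
print(round(pre_tax_eur), round(shelf_eur))   # roughly 367 and 444

So a "399" card landing around 440-450€ is just the math, anything above that is the retailer adding margin on top.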
worth every penny
Now post the 8GB, you filthy shill.
>RE got a 30% fps increase across all GPUs in the 8gb chart
LMAO WAT?
Read small print at the bottom.
>Disabling RT at 1080P so it won't crash
LMAO @ VRAMLET
499 US DOLLARS for single digit fps increases at (upscaled) 1080p lmfao
>1080p
>comparing normal rendering on old cards to DLSS3.0 fake fps
>1080p with Super Blur Motion Sickness mode on
Nice.
>fake frames that magically doesn't work on previous gen tensor core
>even with fake frame its only 30% more on average
>without fake frames its 10% on average
...and Nvidia thinks that it's worth paying shills to convince people to buy this shit lmao.
>marvels guardians of the galaxy
so we just pretending these are games now
Isn't it worse than 3060? Anons said it has less cuda cores or something. It means it's gonna suck for AI, right?
yeah but it is dlss 3.0 compatible so if you're building a 1080p/1440p pc and don't mind enabling dlss you're getting great value.
I guess no point switching from 6600XT. I'll wait for 5070 or something...
poorfag
My salary is around 100$/m these days, yeah. I'm literally eating less to save for PC upgrades...
Seriously are you for real? You have a better card than me (I'm on a 6600) and I make like $3k/month.
I think even third worldies make more nowadays.
Yeah, no joke unfortunately. Just got really unlucky in life and currently can't even get a better job. PC is literally the only thing I spend money on, even my clothes are super old, I need to buy some this year I guess.
No, we don't.
that's nearly a dream wage lol, only some seniors earn that much.
Are you a high class gigolo? You must suck some mean cock and be into some degenerate shit to be earning that kind of cash.
>animeshitter
pottery
what? he said 100 a month bro
that's like 1200 a year
I got a 2080S and i'm kinda sorta maybe considering rebuilding even though i really don't need to.
So it's compatible with the feature that introduces artifacting and input lag, and is only even remotely playable at high fps which the 4060 will not be getting. Amazing value indeed.
dlss looks better than native, and the input lag generated is imperceivable
>dlss looks better than native
have a nice day corporate shill
cope, i bought a nvidia gpu and it simply works better than amd and dlss simply looks best
the guy even says its not noticeable in motion and specifically only has criticism of frame generation (again hard to notice while gaming), looks perfect to me
>upscaling shit looks better than native
>input lag imperceivable
Get cancer garden gnome tool
Stop kvetching, goy.
what's the alternative? amd? get a grip retard, the 7900 xtx costs more than the 4080 and performs worse in most games while having fewer features
Eat shit uncultured animal
>the 7900 xtx costs more than the 4080
In what shithole?
on newegg? ur a spastic if you think the 7900 xtx is cheaper than the 4080, while it also lacks dlss and codec shit that the 4080 has
>on newegg?
US? My condolences.
>ur a spastic if you think the 7900 xtx is cheaper than the 4080
meanwhile in reality:
https://www.newegg.com/p/pl?d=4080
https://www.newegg.com/p/pl?d=7900xtx
7900xtx is available under 1k
I am not some tech illiterate mutt that can be fooled by that shit propaganda.
There's no method of computing or rendering magic that can magically make 1280*720 upscaled to 1920*1080 look better than native plain old 1920*1080.
Yeah but it can do 1440 > 4K better than native 4K because DLSS also has a fundamentally better AA process
If you want good AA, then use DLSS at 100% resolution. No upscaling shit.
You cannot tell the difference at 4K
Even straight bilinear or lanczos scaling directly from 1440p to a 4K pixel grid is good enough even on a big TV anyway, DLSS helps just enough.
I remember playing forza on my 4k TV and not realizing it was 1440p upscaled for hours until I checked the settings menu.
Look fag you can lie to yourself and repeat that shit like a mantra all you want.
It isnt gonna become true.
>Complete denial
Suit yourself
You will never be a real 4K frame.
You are a 1440p resolution image stretched and mutilated with artifacts to be displayed on a 4K screen.
Real 4K native frames laugh at you behind your back.
Suit
Yourself
I had a 1080p 24 inch Benq monitor back in 2008.
My Ati radeon HD3650 256MB was barely able to output image for browsing and mundane tasks onto it.
I played dragon age origins at 720p upscaled to 1080p. And it looked like shit.
But then I got a HD4850 512MB and it was handling it all a bit better.
But man it wasnt until the R9 280X 3GB that it all fit together nicely.
All games running at native 1080p with high settings and good frame rates.
One thing that was weird when the first 4K TVs started showing up and normalfags asked me for advice on what GPU and system to build for it, was that there was no real 4K card to suggest.
Before the 4090 I couldn't just say
>that's top dollar most expensive shit but you get hardware that can do 4K
I refuse to converse with people that believe rendering the game at a lower resolution then upscaling it looks better than native resolution. When consoles do this we foam at the mouth but somehow literal peasantry has now become embraced in PC gaming. Actually pathetic. The only reason people delude themselves so completely is because lazy ports can't run on their systems at anything above 50fps if they don't overly rely on lowering the resolution to save the experience. Sad!
Both DLSS2 and FSR2 offer the best anti aliasing we've seen in years.
Not even close. SSAA and MSAA are objectively clear, you just don't use it because your system can't handle it. Which paints the irony of the picture because SSAA renders assets at a higher resolution then downscales them making for immaculate picture quality and sharpness while DLSS renders assets at a lower resolution then upscales them and applies post processing to blur out the flaws.
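For anyone who wants the difference spelled out, a toy numpy sketch of the supersampling half (render high, average down); the random array is just a stand-in for a rendered frame, no engine does it this literally:

# toy 2x SSAA downsample: average each 2x2 block of the high-res frame
import numpy as np

hi = np.random.rand(2160, 3840, 3)           # pretend 4K render target (H, W, RGB)
h, w, c = hi.shape
lo = hi.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))   # 1080p output
print(lo.shape)                               # (1080, 1920, 3)

Averaging many rendered samples per output pixel is why SSAA looks clean and costs a fortune; upscalers go the other direction and have to guess the detail instead.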
Maybe it's my gsync monitor, maybe it's the nvidia low latency i got enabled, maybe it's my almost 40 year old eyes but i never noticed any of the stuff you're talking about. I get around 100-130fps in MW2 with DLSS at 1440p. Sure, the game looks objectively worse in screenshots with dlss enabled but as long as the picture stays in motion there is no difference.
>$400 for a low tier card is healing
Are you fucking retarded?
So many people in this thread talking about pointless shit, how about you save some money and buy a decent card instead of like a 4060 or some nonsense. I'm on welfare and I have a 4080, 12900k, 32gb ram, etc. WTF is your excuse for a poorfag pc, who gives a fuck about vram
>low tier 8gb / 128bit for FOUR HUNDRED DOLLARS
why is a 128-bit bus width so important? what does it affect and do?
Doesn't actually matter because they added more L2 cache. It's the same reason why you can use shit ram on 5800X3D, the ram speed doesn't matter because things are mostly in cache.
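Crude back-of-envelope of the cache argument; the 50% hit rate is a made-up example figure, the real number depends entirely on the workload:

# crude effective-bandwidth model; hit_rate is an assumed example figure
dram_bw = 288.0     # GB/s, 4060 Ti: 128-bit bus x 18 Gbps GDDR6 / 8
hit_rate = 0.5      # assumed fraction of requests served from the large L2

# only misses consume DRAM bandwidth, so the shaders see roughly this much
effective_bw = dram_bw / (1.0 - hit_rate)
print(effective_bw)   # 576.0 GB/s "effective", if the hit rate really were 50%

The catch is that the hit rate collapses at higher resolutions and with huge texture sets, which is exactly where the narrow bus starts to show.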
>xx60 series card
>$400
Gaming is dead.
Nvidia why do you have to be a memory garden gnome?
Nah
Waiting on the RX 7700 XT to see how it compares at $100 less than the 16GB version
>nVidia is healing.
No it's not, they just got to a point where they weren't selling any cards so they had to lower the prices. Still, that GPU should cost 200€ like a 1060 cost, not 500€ like the 4060 costs, this is a literal scam and you're falling right into it.
>8gb
Lol
Lmao
Name a single game that needs more and isn't goyslop.
Name a single game that doesn't require it that isn't dogshit.
Check mate.
So, you admit that you’re checking me out and you wanna mate
Everyone wants to mate but it can only happen between a man and a woman or an asshole or a hand. And you're neither, you're a beautiful soul.
I want to buy a new card.
There are 2 choices
>3060 12gb - new
>3060 ti 8 gb - used but in good condition
The price is same for me. 1080p target.
Which one?
Depends on whether you want to fuck with AI or are worried about the AAA VRAM meme. Do you know the history of the 3060ti or are you just getting one off of ebay?
Another option would be to wait for the 4060 to drop in a week or so. It's supposed to retail for 299 burger tokens so you could probably get one for the same price as a 3060 if not cheaper.
>lower bus than my 3070ti
I'M BUSSIN
60 series is not for gaming
what do you use this for?
for AI and shit
> $400 for a 8GB card with piss poor performance in 2023
No, anon. They are not healing in the slightest.
the "I need more than 8gb of vramfor 1080p" meme was spurred by nvidia and amd shills and pajeet programmers who can't code.
You know what's the ideal amount of VRAM for editing 4k movies? That's right: 8gb.
And do consider the fact that, in theory, games should not be as demanding as editing movies.
And let's also not mention that games also use the CPU's ram, which is on average 16 GB.
No matter how high fidelity your game is, if it can't run well at 1080p with 8 GB of vram, it is simply unoptimized GARBAGE
End of story
You talk a lot about resolution but that's hardly the biggest culprit. On average, going from 1080p to 1440p only increases vram usage by like 1GB, if that. It's often the massive 4096x4096 (or even higher) textures that take up most of the space.
NTA but you're exactly right, remember how uncompressed audio inflated video game file sizes? Same shit is happening with textures and it doesn't make them look better. It's placebo and it only serves to make the new graphics cards with whatever developer has the sweetest deal look good.
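Rough numbers on why the textures dwarf the render targets; assuming uncompressed RGBA8, BC7 at one byte per texel, and the usual ~1/3 extra for the mip chain:

# back-of-envelope texture vs framebuffer sizes, in MiB
def mib(nbytes):
    return nbytes / 2**20

tex_4k_rgba8 = 4096 * 4096 * 4          # one uncompressed 4K texture
tex_4k_bc7   = 4096 * 4096 * 1          # same texture block-compressed (BC7)
fb_1440p     = 2560 * 1440 * 4          # one RGBA8 render target at 1440p

print(round(mib(tex_4k_rgba8) * 4 / 3))   # ~85 MiB with the mip chain
print(round(mib(tex_4k_bc7) * 4 / 3))     # ~21 MiB with the mip chain
print(round(mib(fb_1440p), 1))            # ~14.1 MiB per buffer

A handful of render targets is a couple hundred MB at most; a few hundred ultra textures is where the gigabytes actually go, which is why bumping the output resolution barely moves the needle.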
why would anyone still play at 1080p in 2023
Because he does not fall for marketing gimmicks and is not a consoomie
Most modern AAA releases on PC are unoptimized, and older cards in the same class released with more VRAM or for far less money. Gimping vram on next gen cards isn't going to change how developers make games.
>Gimping vram on next gen cards isn't going to change how developers make games.
This. You can:
A: Complain about unoptimized garbage and don't play new triple As.
or
B: Buy a card with enough VRAM to play unoptimized garbage
Developers aren't going to improve.
xx60 cards don't need more than 8GB regardless of resolution, since the card will throttle long before VRAM does.
Remember this thing is only like 10% faster than a 3060ti on a like-for-like basis. And nvidia is trying to garden gnome you into thinking you can use it for AI. kek what a shitshow.
the 3070 and 3060ti already struggle with 8gb
No the 60 cards struggle because they aren't powerful enough to run modern unoptimised AAAslop on max settings. It has nothing to do with VRAM. It's a midrange card and people think you can max out graphics on it.
The 70 cards does need more than 8GB though, since those should be targeting 1440p resolution or higher.
>It's a midrange card and people think you can max out graphics on it.
you could if it had more vram
Nah. Not in most cases at least. The framerate would drop too much before hitting the VRAM limit.
the 3060ti is only around 15% slower than the 3070 you fucking moron
Check videos of a modded 3070 with 16GB or the A4000 (same chip, lower clocks). The card is not that powerful for maxing modern triple As.
What if the hardware of the cards is exponentially better except for VRAM, but the software of the video games is worse?
>but the software of the video games is worse?
This is absolutely the truth. The only thing we can do is buy a powerful card with as few bottlenecks as money allows. Either that or cry about it later that your card can't play new unoptimized games because of the lack of VRAM or low bandwidth (lol 128bit cards in 2023). In a perfect world everyone would have the money to buy a 4090 and be happy.
I'll make a powermove and play ye-olde vidya.
>people think you can max out graphics on it.
Isn't the x060 always the way to go to max things at 1080p?
A card that could max out 1080p was a lot more impressive >6 years ago than it is in 2023.
Alright but if I only cared about 1080p would buying a x060 be good enough? The only catch that concerns me here is the vram difference that might act as a bottleneck considering how modern games are designed. Am I wrong?
It *should* be good enough. It's not that you're wrong, it's that these companies are wrong for thinking $400 for an entry level card is acceptable
>1030
>1010
These are for media playback on a setup without an igpu not for gaming
We're seeing senseless price increases across the tech industry. Be it smartphones or VR sets. The prices are only getting bigger and bigger. I share your same concerns but if I had a stable job I would probably bite the bullet since I am not the kind of person who upgrades parts often
Smartphone prices have been relatively stable and VR is a niche emergent market. I spent less than $1k total across multiple upgrades for a flagship phone over the past half decade or so.
I will never spend more than $400 (after tax) on a gpu. I just won't.
What's the best CPU/GPU combo to play in 1440p/144hz?
If you're only going to play, 6950xt. For everything else, at least a 3090.
pentium 2 with geforce2 mx 400
>Pentium 2
moron please.
>i want to spend more on a monitor than my computer, or use a dogshit tn panel
why
Imagine being obsolete right off the bat.
>he can't afford the nvidia card that can magically make 1280*720 upscaled to 1920*1080 look better than native plain ugly old 1920*1080.
>I didnt overpay for a graphics card mr Shlomostain is my ally and nvidia corporation is my family they didnt fuck me up the ass and rape my wallet
Jensen tongues my anus
>8GB
completely useless for modern AAA games, buy a PS5
Why is digital foundry like that, and why do people act like they're any kind of authority on game performance?
Because they are an authority on game performance over any other outlet or channel.
>digital gayry
>authority on anything but shilling
Remember the xbox ambient temperature of 18 degrees Celsius reviews?
You gays accuse every tech channel of shilling.
Digital gayry is notorious for being shills.
You gays accuse every tech channel of being shills.
How do you think garden gnometube gays earn an income, retard?
Thank you for proving my point.
Making some gay video to show how well a game runs is not a job and takes no effort you retard.
They get paid in advertisements and sponsorships. Corps pay them to show their products on their channel for exposure.
Why because you say so? You retards accuse all youtubers of not having a job, no online video takes effort it's all fake blah blah blah.
>but they have sponzerships!!!
I don't give a shit. You have no serious argument against their credibility, just the same tired accusations you use against every outlet and channel that covers game performance.
>Making some gay video game or making sure it runs well is not a job and takes no effort you retard.
>Making some gay video game or making sure it runs well is not a job and takes no effort you retard
Apparently it does since diversity hires can't program for shit and as it turns out don't have the skills to make a proper, functioning game.
Why would they when making a video game isn't a real job and takes no effort
Digital gayry is notorious for being shills.
Digital Foundry is notorious for triggering console fanboys by exposing their weak performance and getting accused of being shills.
I miss Totalbiscuit's videos.
The trick is realising that people WANT an authority figure to tell them what to think. They actively seek them out.
SILKY
I
L
K
Y
silky smooth as a widows web
Has this happened before in the gpu industry? The 3060's MSRP was $399 and it has 12gb of vram on a 192-bit bus, and I got a used one for $240. Yeah there's a decent performance improvement, but at a higher cost, with other worse components and less ram
>XX60 for $400
Breh
Back in my day, the GTX 780 was $400
>60 class for over $250
At this point im certain people just like cocks in their ass.
No.
What PSU would you need for a 4060 ti?
A QUALITY 550w minimum. Any decent 650w if you want to cheap out.
>650w
I have a 650w titanium Seasonic PSU. This should be good right? My current GPU is dying, it would randomly black screen when playing any 3D AAA slop but it's started doing that while I'm just shitposting and not even gaming.
That's more than enough. The 4060ti is a 165 watt card.
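Back-of-envelope using the 165W figure above; the CPU and everything-else numbers are rough assumptions for a typical midrange build:

# rough PSU headroom check; CPU and "rest" figures are ballpark assumptions
gpu_w  = 165    # 4060 Ti board power, per the post above
cpu_w  = 150    # assumed mid-range CPU under gaming load
rest_w = 75     # fans, drives, RAM, board, USB (assumed)

load_w = gpu_w + cpu_w + rest_w
print(load_w, 550 - load_w)   # ~390 W load, ~160 W of headroom on a quality 550 W unit

A good 650W unit just buys more margin for transient spikes and future upgrades; your Seasonic titanium is overkill for this card.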
>4070 is already 700€
it's never going to end, is it?
Nvidia doesn't care about the consumer market anymore. They earned a shitton during the great cryptocucking and now they earn a shitton on big corpo AI shenanigans. They don't need us.
>Nvidia doesn't care about the consumer market anymore.
And AMD doesn't care either. They make so much more money selling their enterprise GPUs than consumer GPUs, it's not even comparable. The difference is huge.
We're all fucked. Only Intel can save us but they somehow have worse drivers than AMD.
>only intel can save us
you mean the company who gets the Mossad to steal government IP to then use for their own chips? those guys?
ALRIGHT DUD
Why do you think they are going on about cloud gaming so much? Price gamers out of owning their own hardware and push them onto another subscription service
The 4060 is only $299
The 4060 Ti 8GB is $399
The 4060 Ti 16GB is $499
Should be 100 bucks less across the board and the 16GB version costing 100 more than the 8GB version is absolute giga garden gnomery.
Nahhhh gonna get a 6700 xt this build, in another ten years when i build another pc maybe the rtx 5xxx will be better
It should be half that and retailers are going to make it 500
You don't need 1080p goy, dlss is fine to get performance, a base resolution of 640x360 is fiiiiine - why would you need more!?
I expect this to perform worse than a 3070
I bet it will be on par at lower resolutions, but slightly slower at higher resolutions. Exactly how the 4070 compares against the 3080.
holy fuck people are going to make so many ai pics with this thing
Ok. What's the best AMD alternative then?
There isn't one yet. They have no idea how to fix their RDNA3 crap. It's full of bugs and high power draw. This is why they're taking so long to release more cards.
Low end 8GB card for 400$ (closer to 700€ Europe).
No, it's fucking over.
>closer to 700€ Europe
What? The 4070 goes for 650-670€
It starts at 900euros lil bro
That's the Ti.
My amazon
Dumb anon has been looking at the Ti prices all this time kek
My bad, you're right
4070 is so cheap compared to 4080
The 4070 is such a great gpu Nvidia is putting a temporary halt on production.
> slower than a 3080
> great
Huh...
I think you missed the point
>8gb
THANK YOU BASED NVIDA
jesus christ, the world really did end in 2012
is there a similar chart for amd?
Not him but AyyMD didn't have high end or even xx70 equivalent cards for many generations generations.
What generations are those then, you retarded shill?
Tesla/TeraScale - GTX 275/HD 4890
Fermi/TeraScale 2 - GTX 470/HD 5850
Fermi/TeraScale 3 - GTX 570/HD 6970
Kepler/GCN 1 - GTX 670/HD 7970
Kepler/GCN 1 - GTX 770/R9 280X
Maxwell/GCN 2 - GTX 970/R9 390
Pascal/GCN 5 - GTX 1070/RX Vega 56
Turing/RDNA 1 - RTX 2070/RX 5700 XT
Ampere/RDNA 2 - RTX 3070/RX 6800
Incidentally, the only card in that entire list on the AMD side that was their top end offering that generation is the RX 5700 XT, since RDNA 1 never got a big GPU. Can't wait to hear about the "many generations (generations)" I missed out on which have also been scrubbed from the internet.
Blown. The fuck. OUT.
>mfw I remember how fucking dominating the HD 4 and 5 series were
The 7000s were nice, but were the start of AMD really pumping dem volts into the high end to stay competitive pre-Pascal, a Sapphire 7970 is probably the worst card I've ever owned in terms of reliability. The button on the card let you switch between
>really high stock overvolt
Or
>VRM toastingly retarded overvolt with 8% OC
I had to bake that fucker back to life multiple times, which sucks, because that card was a fucking beast. And then Catalyst transitioned to Crimson, and after a bunch of really bad AMD replacements, my Fury X became a 1080 ti
It tickles me to see the big boi tahiti chip on the list twice.
I am forgotten
We seriously no shit need HBM on future GPUs if we wanna solve the VRAM/high resolution crisis. Fury was a terrible showcase of HBM, but the HBM itself let it punch a lot higher than it should've; speed, bandwidth, and timings are insane, but 4 FUCKING GIGS ON THE CARD THAT WAS SUPPOSED TO SHOWCASE HBM. I couldn't run Mankind Divided at all on that thing without turning everything down to very noticeable medium-high compromise settings, and it also released during the move to Crimson, games just didn't fucking work, the control panel never applied settings. The 400s and 500s ran circles around it just because they had 8 and 12GB, 4GB sealed its fate, and Nvidia hoards HBM for data center and AI bullshit
>Ampere/RDNA 2 - RTX 3070/RX 6800
6700xt is enough to BTFO 3070.
At this point I would wait for Battlemage. Intel at least is going to try and give a fuck.
>Dude you NEED to have 12GB minimum to play these trash ports with bloated hardware requirements and shit tier optimization!
Here's the rub amigo, if you are playing the "well optimised" games you don't need a 4060 or 4070 either so you are getting fucked twice.
>the 4060Ti 16gig is only 60 bucks cheaper than the 4070 in my country
Ok Nvidiarrhea
Was considering getting the 4060 ti 8gb instead of the 4070 because of how much cheaper it will be but the performance gap will probably be way bigger than 3060 ti vs 3070
I don't really play new AAA games that need a lot of vram but would like to max out some older titles at 1440p while getting over 100 fps
Hey guys I got 4070ti for 4K is it good? It was 900 euros
For AAA trash only with DLSS enabled, otherwise yes.
Sorry but not really, honestly.
I don't see why you'd pay so much more for the same amount of ram as the base 4070.
If you were gonna spend close to 1000 bucks on a card you should just go for the 4080 honestly.
>4070ti for 4K
Don't do it bro.
>Useless RTX crap
not paying for that. give me actual GPU power, why would I waste my money on gimmicks I won't use most of the time? RTX should be an optional thing and the RTX chip should be a general thing that provides power for all games even without RTX.
AMD cards at least have more power for textures and shit, instead of wasting it on raytracing.
Reminder: 4090.
lel, keep buying recycled crap with gaytracing shovelware
>RTX 3060
>12 GB
>RTX 4060
>8 GB
lol terrible decision making, NVIDiA are fucking retarded
Why is it a terrible decision?
gays will still buy those.
Would have sold much more by adding a little VRAM. terrible decision making
But then they would have to raise the price making those cards DOA even more than they already are.
3060 also has 8GB version.
But adding 6 gb of ram is not expensive. VRAM is one of the most important characteristics that plebs pay attention to when buying cards; they will absolutely buy one card over another, or not buy it at all, because of a few gigs of vram.
If old cards have more VRAM they won't buy this one. This deal is not better than a used 3060 12 GB. If your new card is not better than buying old used shit then it's not a good product
The only other configuration possible for the 4060 would be 16GB, and as you can see nV is asking a steep price for an additional 8GB of VRAM.
4060 is 1080p card at best, low end gamers don't have high standards but they care about the price more.
what do you mean low end, this is not a low end card, it runs the games. $400 may be low end by these crazy prices, but for people $400 is still a lot because they don't live in a GPU world of new gpu prices, they live in the real world where they pay their bills. People who are not willing to spend a lot more money but are tech savvy enough to build their PC and buy a single GPU are actually quite picky and try to go for the best gpu power per dollar.
4060 is low end card, 4050 whenever it will launch will be entry level GPU for this generation, just because we don't get many $250 GPUs anymore doesn't change that.
Buying low end hardware is stupid even if you are poor because those GPUs age poorly.
If you can't afford $400 every 3-4 years you need to stop playing games and invest in yourself.
Gaming as a hobby is very cheap it's the main reason why it's so popular.
I'm lucky to be youropoor, I can just move to another country whenever I like.
>Buying low end hardware is stupid even if you are poor because those GPUs age poorly
Tell that to my GTX950 that still plays everything I want to play.
>everything I want to play.
This is a coping mechanism, I know it well. Once you taste freedom you won't be able to go back.
I used to raid in wow with sub-10FPS, I would consider this now unplayable.
You actually are a paid shill, aren't you. Nobody can be this stupid. Go compare Vanilla WoW spec requirements to Classic. It's a fucking joke.
What are you even talking about?
I'm just saying that when I had shit hardware and was younger I could accept a shit gaming experience, and now I would not enjoy that even a bit.
but we do, rx 6600 is a 3060 equivalent and costs 240 bucks new + a free game (right now it is sold with the last of us 1), in NEW condition of course, in France (so not a cheap country), probably even less in the US
nGreedia has near monopoly, many gays don't even know AyyMD sells GPUs.
Gaming laptops are also very popular among normalfags and those very rarely come with Radeon as well.
But yea 6600 is good enough if you enjoy sub-cucksole experience.
>Buying low end hardware is stupid even if you are poor because those GPUs age poorly.
Once upon time there were sweet spot GPUs in every gen usually around mid tier.
Like the reasonable priced option that performs ok.
It was 1060 6GB, RX 570 4GB, 750ti, R9 270X, HD5750 (same chip as 5770) 8800gt 512MB..
Yes that was the case. You can still play 95%+ of the games on older hardware. I would remove 750Ti from your list it was trash from the beginning.
It's just games are made to run on cucksoles first, plus any corporation just cares about profit. So if customers can keep their hardware for long they miss out on sales.
If you are fine with 1080p 60FPS you don't need to upgrade often.
I'm just elitist prick, that's all.
750ti was real cheap used and performed well over what people paid.
Like if you could buy 1050ti at $50 or 3050 at $100
It came so late it was a super slow card, the 9xx series was launching a few months later. I know nV skipped the 8xx cards on desktop.
It was poor value, and the used market is a separate thing because it varies drastically from country to country. Even now retards are trying to sell 750Ti's close to the price of an rx570.
Theres millions upon millions of them for a reason.
it was literally a launch card for the Maxwell series
its best strength was requiring no auxiliary power connector and enabling normies to slot it into any kind of existing or second hand Walmart tier PC and get a comparable experience to consoles
Every 40 series card except maybe the 4090 is selling like shit and nvidia is starting to feel the pinch as shown by the very slight price drops and slowing of production. Everybody just has to keep on being patient.
lol, they're gonna raise prices on the 970 x10 when the 5090 comes out, keep feeding those assholes.
AMD needs to pick up the slack and bring some more competition for everyone's benefit.
The stunt they pulled with the RX 580 and Hogwarts Legacy has ensured they will never get my money again. Intel is my only hope.
>8gb
>400 dollerydoos
>can't even run RE4 remake at 1080p maxed out
LMAO the only thing nVidia is healing is its wallet you daft retards
It's for pleb 1080p, at that point just get a fucking console or laptop it's all the same shit
>at that point just get a fucking console or laptop it's all the same shit
Are you zoomer or something? Even worst GPU is better than console trash.
If you need a cheap GPU, you can get a new RX 6600 (RTX 3060 equivalent) for 250 bucks. Don't waste your money with garden gnomevidia.
bought a 6800xt and waitfagging for the 5000 lineup. this generation is a wash, just save up $1000-$1200 for next gen. in a year or two get 2x32GB DDR5, a 14th gen intel CPU+mobo, a 4th gen NVME and you're good until 2029.
figured I'd ask here.
I've been using a nvidia 1660 TI for ages and it's really showing its age at this point. Are there any graphics cards that are decently priced that I should aim for? I do like to play modern releases.
What resolution and framerate do you want? Do you care about raytracing and DLSS?
1080p and a nice constant 60fps, raytracing would be nice yeah, DLSS I could take or leave.
Just get a 4060 then. Although if you also have an ancient CPU you might want to consider an upgrade on that too so it doesn't bottleneck your card.
>1080p60fps
Is a 1660 not enough for that? I used a 970 for 100 fps just a few years ago.
rx 6600, best cost per frame
alright, looking into those, thanks for the help anons.
I'm about to pull the trigger, but need to know: 4070 for 1440p for the next 5 years. Too much or just enough?
IMO any form of upscaling looks horrible on 1440p, supposedly it's better on 4K. I would look at native performance numbers and keep that in mind.
I think it's enough if you don't plan on playing the latest AAA games at max settings for the next 5 years because of the vram meme. If you're targeting 60 fps then yes, it's definitely good enough maybe even slightly overkill but you'll be future proof.
I'm personally considering 4070 for 1440p 144hz on older games so vram isn't much of a concern. I mean it will definitely run new future games but without ray tracing or some extreme settings.
My bigger concern is how to go about my old A320M-A motherboard and cpu. I refuse to buy new motherboard chipsets and DDR5 memory.
>4060 ti (8GB)
>4060 ti (16GB)
Which one should I pick?
Neither
have a nice day Nvidia marketeer
the only thing that is "healing" is your gaping asshole from all the anal fisting they gave you throughout the years
>developers are lazy and openly admit how they operate under the philosophy of just brute forcing their badly made games through
>they blame the hardware when they can't brute force
Of course they're both fucking the customer. There's no reason for Nvidia to garden gnome out on VRAM because it's cheap as dirt for them. Developers need to LEARN TO CODE and stop using Tim's hekkin' chonker shit Unreal Engine for everything. Also stop adding polygons to models, you've added enough.
Dude, 2070 was 499
There is no healing, it's FUBAR
>4060
>8gb
>128 bit bus
>$400
>healing
Nvidia shills are truly retarded subhumans. It's even more funny when they ignore the fact that there are GPUs out there right now with 16GB for less than $400 like Intel Arc A770. If you aim for budget and still pay more for less you are by definition a retarded subhuman.
I have a 6gb 2060
Would it be worth it to get a 4060? The fact that it draws less power than my 2060 while being almost double the performance seems interesting
The only big game I play on my 2060 is modded Skyrim and I get roughly 45 fps which isn't bad, other than that I mostly play older games from the 2010s
However the noise on my 2060 is bothering me when it's under load, so I had to cap the fan at 45% under load and undervolted it, resulting in a roughly 5-10% performance loss, but still...
At this point a 3060 would be a better idea.
I would like something kinda future proof that will work for the next 5 years
Might just wait for the rtx 5 series at this point, my 2060 is still going strong...
Battlemage next year with Intel then. They are actually going to compete for once.
>>RTX 4060
>>$399
>it's just a 4030 with a new name
next
>$400 for a fucking low tier GPU
are people fucking insane? I just paid like 550 for a 6800XT with 16GB. All for this RTX bullshit? Eat my ass, there's pretty much no games worth playing that take advantage of this unless you're a AAA goyslop consoomer.
I bet the 4060 regular version will barely outperform the 3060TI while being more expensive.
In fairness, there's a good reason why Nvidia can get away with charging this shit, since their GPUs are also now used for AI.
Plus, there's the fact that they know they have consumers by the balls in terms of overall power efficiency, brand reputation, better drivers than AyyMD, and software. DLSS as an upscaling solution is so much better than FSR it's not funny.
sucks to be you tech illiterate
pay the retard tax for the chinks
Had a 1070 for almost 6 years before upgrading to my 4090, definitely worth it
>went full retard
congratulations
>gtx1630
>$150
>rx 6400
>$140
this is fucked. you can get 2013 cards for free/10 bucks that have the same performance
the morons just don't care about the low end anymore
Fuck, 4080 is still not dropping price.
Guess I will have to wait for the 50xx series in the next 1.5 years.
fuck the jannies
didn't realize there's already a graphic cards thread, crossposting for help
>worse than an intel a380
Oh no no no no
Is it true that I need a good CPU if I'm aiming for high fps, and a good GPU for high resolutions?
So does that mean I should just get a 4070 but pair an i9 with it?
$499 aint bad if it can outperform my 2080 ti
It's trash if an Intel A770 is better at a cheaper cost
Get a 3090 Ti or a 4090, otherwise it's pointless to upgrade. Or even better, wait for the 5090/5080 Ti.
The 3090ti is a very bad buy unless you get it for a bargain. The 4090 came out so fast after that power hog and made it completely obsolete value-wise
Will there be a 4080 Ti? Or a 4090 Ti? Is this series officially over?
>8GB
>16GB
lmao I'll just get an RX 7800 XT
> Worse than a 970 at 10x the price
lol
How so? 970 was a card that could barely run Far Cry 3 on ultra in 1080p 60 fps. Your standards have just changed. You want everything to be so much more. Touch grass.
How do I learn to recognize what computer parts are worth buying and what is outdated/trash/overpriced?
Learn about basics of computing, learn about how programs are compiled how they are executed, then learn a bit about CPUs and memory, then find out what you really like doing with a computer.
Like if you really enjoy some game, find out in what engine it was made, what kind of system requirements it has, find it running on shittiest lowest setting, find it running at the best possible imaginable settings and hardware.
Watch a few videos about monitors and how they work and about their resolutions and refresh rates.
So after your understanding of everything broadens and deepens you can understand each component better.
I was lucky my favorite game was Quake 3, and I could even look into the source code and change random texture files and sound files and do whatever I wanted with it.
Building some poverty gaming computer can also help you heaps.
Like buy some shitty used office computer with some old i5 like the i5 4570, then add some old gpu into it, change the PSU, add more ram, add an ssd, install an os.
Like waste $250 to $300 tops to give yourself a learning experience that can only come from real world practice that you do yourself.
who cares about anything but Vram these days? People want AI whores sucking their dick, they don't care about raytracing.
>Gimped VRAM at 270GB/s when 2060 and 3060 had 320GB/s+
Can you even tell the difference? I got 3070 Ti which supposedly has very fast VRAM speeds but I can still easily tell when it's loading in the textures for the first time.
R9 390 had 384 GB/s of bandwidth with 8gb of vram in 2015. That thing was 330 bucks brand new. We are evolving backwards.....
It's because we've reached the limits of what they can do with current designs and nobody has any idea of what to do next. At least AMD has chiplets, vcache and infinity fabric to play with. It's almost time to do another unified pipeline and mix shaders with AI.
1080 Ti had 484 GB/s on a 352-bit bus. The 4080 has a 256-bit bus.
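All of those figures come out of the same formula: bus width in bits divided by 8, times the per-pin data rate. Quick sketch with the stock data rates as far as I know them:

# bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate in Gbps
def bw(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bw(352, 11))     # 1080 Ti, GDDR5X: 484 GB/s
print(bw(128, 18))     # 4060 Ti, GDDR6:  288 GB/s
print(bw(128, 17))     # 4060,    GDDR6:  272 GB/s
print(bw(256, 22.4))   # 4080,    GDDR6X: 716.8 GB/s

The 4080 gets away with 256-bit because the GDDR6X chips run much faster and the big L2 soaks up a lot of traffic; the 4060's 128-bit bus has no such excuse.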
I still think to this day the 1080ti was a mistake, they probably never intended to release such a powerful card for its price
>128bit memory bus
Enjoy those stutters OP.
That's pretty decent
Wooaaaahhh!!!!! I might upgrade to a rx580 after this earth shattering news. You cucksoomers really are something special.
>morons want the top tier GPUs at 200 dollars to play their goyslop AAA kusoge
>in this inflation
you'll never EVER get entry level GPUs for cheap ever again
it's fucking OVER
>Muh inflation
>Things are more expensive...just because they are ok!!!!
It's called price gouging and you are gleefully going along with it
I'm not gleefully going along with anything, I'm a 1060 at 1080p chad and I'm perfectly fine with this turn of events
>things are more expensive because they are
yes
you think you're being smart with that clever "haha gotcha" but that is literally how things work
Nvidia can release cards without the RT gimmick easily for cheap but they refuse to do it.
explain to me, why can't a computer work like this?
Combine 1070 + 2070 + 3070 = greater power than 4070
>Just pay $400 for it goyim
AHAHAHAHHAHAHAAA
Diminishing returns with SLI, plus the requirement of rendering frames in real time and needing the data from the last frame to render the next one.
SLI and crossfire never really took off.
>low gpu
>500€ + scam prices due to shortages
no it isn't healing shill
You fags will complain about anything
4060 Ti is the same MSRP as the 3060 Ti and has a decent performance bump, or a massive one if you are ok with DLSS3. Keep in mind the 3060 Ti was released before Bidenflation really kicked into gear. Nvidia is fighting tooth and nail to keep GPU prices low and it seems only I'm smart enough to see it.
If it's that bad, then they should lengthen the release cycles, and pester devs to stop being so shit at optimization.
>You fags will complain about anything
No shit. Us fags complaining is literally the first and last line of defense against your corporate overlords.
Oh Anon. Don't you know that the multi-billion dollar globalist corporation is his replacement for God?
there is nothing wrong with these GPUs as long as they're priced appropriately
Imagine it was
$199 4050 Ti 8GB
$249 4060 8GB
$299 4060 16GB
instead of
$299 "4060" 8GB
$399 "4060 Ti" 8GB
$499 "4060 Ti" 16GB
Would anyone actually complain?
a little more expensive than i would like but no
SLI was incredibly janky shit to get to work, and even then it didn't work a lot of the time from what I recall. There's a reason gamers gave up on it.
1080p card is not for PC gaming
It’s for console gaming on a PC
>8GB
ngmi
imagine getting mogged by a 3060 12gb
vramlets will never learn
Can't wait for Intel Battlemage B770 24GB to curb stomp Nvidia shills into the fucking ground.
you have no drivers
you will never be a real graphics card
>you have no drivers
Literally gaining more FPS with each update running anything I want flawlessly.
You on the other hand have no drivers, no vram and no video you shit faced Nvidia shill suck my veiny unwashed penis with left over smegma clean.
>$400
>for a gpu that can't even do 1440p because of the crippling narrow bus
Lmao. Not a chance in hell.
>500 USD for the model that isn't obsolete by design with gimped VRAM
Bravo, nvigarden gnome, I will surely upgrade now.
Yeah, no, fuck off you cock smoking gay.
>memory bus as narrow as Nvidia CEO's micropenis
>the cheaper model has the same amount of VRAM as my GPU from fucking 2014, almost literally ten years later
>the model with somewhat acceptable VRAM for 2023 is the price of a midrange card while touting itself as an entry "budget" model
>the cheaper model has the same amount of VRAM as my GPU from fucking 2014, almost literally ten years later
Same amount, different type. GDDR6 is much faster than GDDR5 or GDDR5X. Not disagreeing that the current generation of Nvidia cards is a blatant scam (other than the 4090, which is fantastic but massively overpriced), but 8GB of VRAM from 2014 is a lot slower than 8GB of VRAM from 2023.
not the other guy but
if he's talking about an 8GB 2014 GPU, I'm assuming he's referring to the r9 290X 8GB, which had a bandwidth of 320 GB/s
4060 Ti has a bandwidth of 288 GB/s, 4060 272 GB/s
>rebranded 4050
>128 bit
Bold move by Nvidia garden gnomes almost if they want to trickster trick the budget NPC into buying it.
my 3060 has more vram than the 4060.......
Ah yes the new garden gnome liminal degrade gpus.
Nvidia wants you to downgrade to their dlss 3 blurry interpolated crap to use less memory so the gpus can mine secretly in the background on special hidden chips in the management engine
kek
RTX 4040 4GB 64 bit
RTX 4030 2GB 32 bit
RTX 4020 0.5GB 16 bit starting at 899
Rtx 4010 1gb 8bit
RTX 4010 0.125GB 8bit starting at 999 but only if yu pre-order our custom dlss 5.0 package (works only on RTX 4010)
Pfft nit havin a rtz 4910 1mb with a no byte bus what thr fuck
the next time I'm upgrading I'm buying a 90
I went from a 1070 to a 3070 and it's just not good enough
RTX 4000 - 0.064GB 4bit (previously 4010ti rebranded)
My RX 6800 does everything I want and offers great performance with 16gb of vram
I went back to my 2080ti, I just didn't need the extra vram yet and FSR picture quality was ass. As well as all the other issues
Can relate, tried AMD but something just felt off, a general degradation of image quality even in 2d despite using the same monitor and cables.
It's really true, Nvidia is unbeatable in all aspects.
Actually I went back to my 6800 because it’s >10% faster than the 2080ti.
Rtx 5090 will be the same speed as the 4090 but twice as expensive, but it has dlss4 which gives you a 4x boost in fps and res over dlss 2 but only up to a 2x boost over dlss 3
5090 will just be a consumer AI card.
I don't even know why normies bother upgrading anymore anyway, but let's pontificate on the card.
It will be even better at ai, rt etc but barely better in raster, as they will force new path trace shit to gimp their older gpus now that everything is good enough in raster even without upscaling/interpolation shit
500 bucks (650€) for a card that's probably as fast as what? A 2080? As fast as a card from 2018? 5 years ago? For 500/650€?
It's DOGSHIT
It's slower than my 2080ti at stock, which is slow as balls and costs almost as much (barely anyone bought one new so they got dumped on the used market cheap as shit either side of 2019-2022 for peanuts, never seen them go above 300usd)
NVIDIA locking dlss3+ to newer cards just to gimp the older ones is disgusting. Intel and amd will obviously follow suit with FSR and XeSS
4080 should be 499 by now. Just because they pulled their cock out by a half inch doesn't mean they aren't still fucking you.
It's utter bullshit.
If the 4070 TI can match the performance of a 3090, then the 4060 TI should be equivalent to a 3080, not a 3070. It even costs the same as a 3070 at original retail price. And why, WHY didn't the 4070 series have 16GB of VRAM?
128bit for the 4060ti 16gb xDDDD holy fuck that's so fucked up. the 3060 12gb was 192bit and the 3060 ti 8gb was 256bit, nvidia is fucking us up badly.
N
I
Nazis BTFO
You don't need a minimum $400 gpu.
You don't need boring unoptimized AAA trash.
G
The fuck does the memory speed even realistically matter. If there is texture loading in then you will see it regardless of the speed. This is like those morons comparing 800:1000 contrast to 700:1000. You can't tell the difference anyway.