PS5 has 16GB of unified RAM shared across VRAM and system memory. Besides, I don't really understand why people see it as a big deal to not be able to run 4K ultra textures in 0.1% of games.
I'm the same. Upgraded my 970 to a 3060 ti after having the 970 since like 2014
Now my CPU, i5-8600 is the bottleneck and needs replacing. Problem is every time I upgrade my CPU I need a new mobo, so this shit is expensive
I'm upgrading from a 980 to a 3070, what should I do with my older card now? I doubt anyone would want a card that's been regularly used for seven years.
>sell it
>or put it in a backup build
>make a backup build that also doubles as a media server or HTPC
>if you do make a backup build, use the 980 to test recent games and make youtube videos so poorgays can see how low the FPS will be on the latest AAA goyslop on their poorgay rigs
>keep it in storage just in case something ever happens to your 3070
we all know nvidia is going to wienerblock their 3xxx users just like they wienerblock their older gen from new features so better pony up for the 4xxx series like a good goy
>what new features?
raytracing
dlss 3 frame generation
they may be meme features, but they are still features. not worth the money imo but features nonetheless
It depends on graphics settings and the game.
You aren't playing Cyberpunk at native 4k with ray tracing on at consistent 60 fps even with a 4090.
1 year ago
Anonymous
so a cope. a blurry shit cope. dlss only seems nice in marketing, trying it out i was disgusted.
t. 1080p player
Looks good on my 4k monitor.
wrong, but you do you. dlss looks like shit to me.
dlss is fine and looks fine. the problem is devs using it (and fsr) as a crutch. games should run well as intended without those two being necessary
I will still disagree. I only tested it on Control and Cyberpunk, on a 4k screen, and I still deferred to either lower settings or resolution because it just felt wrong.
that's funny because a 4090 cannot even keep a consistent 60 fps in dead space without dlss
No dipshit. Its so you can get 60fps on your $1600 gpu when you turn on RT
>minimal image quality loss
I see it and don't like it. Call it placebo or looking for issues, but I don't like it. If you don't mind it, whatever.
Here you go, anon: guess which half is native 4K and which half is DLSS Performance (internal 1080p).
This should be really easy since there are a few specific details here that DLSS handles very poorly.
not that anon but wouldn't it make more sense to have the same part of the image on both sides?
It should be easy either way.
now you switched the sides, native is right, upscaled is left
Correct
NTA but the right is obviously the upscaled one.
>a picture of downscaled render
good going moron shill make more of these while dlss still looks like shit in motion to anyone with a pair of eyes
>Downscaled render
Anon...
oh im sorry it's a downscaled high settings image that's upscaled to your resolution. how is this desirable? and i don't care for your answer because if it works for you fine. i don't like it.
I think it looks good in motion. There are certain artifacting problems in 4k native that DLSS solves (though introducing blurring on movement of certain objects.) The yellow curb dots make it obvious which one is up-scaled, but are horrendously unstable in motion at native 4k, but not with DLSS on.
I think ray tracing looks great in Cyberpunk, and would rather have that on and 70 fps versus having it on at native 4K and getting a little less than 30.
>the graphics are akshuly unstable in native
You sure your brain hasn't just gotten used to the glitches caused by the DLSS?..
All graphics engines feature some kind of artifacting in their rendering.
Unstable thin lines and other problems related to aliasing are a common one, and DLSS (or its non-downscaling implementation, DLAA) resolves them much better.
In some games I would rather use DLSS at quality (internal 1440p) even if I don't need the performance boost because TAA is such shit.
>All graphics engines feature some kind of artifacting in their rendering.
holy fricking cope
DLSS is good if you use it with DSR because then it's like good anti-aliasing which is something that DX12 games don't have. But yes anyone who uses DLSS on native resolution and then says "lol no difference" is fricking blind and wasted their money on a monitor where they cannot tell the difference anyway.
>crying about a 4070 Ti
b***h I have an RTX 3060 and I never even fully utilize it. Most good games are made by indie studios now and honestly I regret upgrading from my GTX 1060.
lads, give it to me straight
I have an MSI 1070 8GB right now and I want to upgrade to something that's better suited for AI generated cooms. What should I get?
so what's the difference between 3060 and ti when it comes to ai stuff as well as blender?
yes 3060 has more ram but ti is faster
who comes out on top? does 8gb just bottleneck ti ?
All the benchmarks for games at least show the extra speed of the ti gives better performance than the extra VRAM
Maybe if you're playing skyrim and running 8k texture mods the VRAM is better
I can't speak for Blender, but for SD 3060 allows you to train models more easily and prompt higher-res images, while the 3060 Ti prompts faster
Though I hear with LoRA you can train with 6GB, you're better off asking the folks at Ganker for more info
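For a rough sense of why the 3060's 12GB can matter more than the Ti's speed for training, here's a back-of-envelope sketch. The parameter count and the Adam overhead factor are ballpark assumptions, not measured values.

```python
# Back-of-envelope VRAM math for Stable Diffusion fine-tuning.
# Parameter counts and overhead factors are rough assumptions.

def model_weight_gb(params_millions: float, bytes_per_param: int) -> float:
    """Size of the raw weights alone, in GB."""
    return params_millions * 1e6 * bytes_per_param / 1e9

# The SD 1.x UNet is roughly ~860M parameters.
unet_params = 860
fp32 = model_weight_gb(unet_params, 4)   # ~3.4 GB
fp16 = model_weight_gb(unet_params, 2)   # ~1.7 GB

# Full fine-tuning with Adam keeps weights + gradients + two optimizer
# states, so roughly 4x the fp32 weight size before activations count.
full_finetune_fp32 = fp32 * 4            # ~13.8 GB, already over 12 GB

print(f"fp32 weights: {fp32:.1f} GB")
print(f"fp16 weights: {fp16:.1f} GB")
print(f"fp32 full fine-tune (weights+grads+Adam): {full_finetune_fp32:.1f} GB")
```

This is also why LoRA fits in 6GB: it trains only a small adapter on top of frozen weights, so the optimizer-state multiplier applies to millions of parameters instead of hundreds of millions.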
i dont know why you guys keep saying the 4070 ti sucks for 4K when youtube videos show it runs fine unless you turn on memetracing and even then it performs about a 3090 level. the only question is how long it will be able to do that as games get more demanding. for current games it seems fine at 4K
Because you don't fricking change your GPU every gen unless you are moronic or have too much money. 12GB is an insult on an $800 GPU; you will struggle with that VRAM to run games at 1440p soon. RT will become the norm whether you like it or not. People cry about it but this is the price of progress.
>just lower settings
>800 fricking dollars
/rant
Have a nice day.
>Because you don't fricking change your GPU every gen unless you are moronic or have too much money.
im not suggesting anyone do that. plus not everyone has a 30 series or RX6000 GPU.
>you will struggle with that VRAM to run games at 1440p soon.
doubt
>RT will become the norm whether you like it or not.
turn it off. boom problem solved
>doubt
You underestimate the incompetence of devs. When the 3070 came out people were saying the same; one year later you already had VRAM issues with Doom Eternal at 1440p, or Resident Evil. It wasn't severe to the point of unplayable but we are getting there.
>turn it off. boom problem solved
Games will look worse than on PS3 without it. See above, lazy devs.
Flopspoken is the best example, game looks like shit and still requires high end hardware.
But then again there is like one or two AAA games worth playing each year.
>Games will look worse than on PS3 without it.
doubt. raytracing isnt the problem in forspoken. if it was, turning it off would improve performance. the game is just poorly made overall
I didn't make myself clear. Once RT becomes more mainstream there won't be nearly as many backup solutions implemented in games. Things like SSAO, SSR, pre-baked shadows, probe lights etc.
RT just takes less time to implement than traditional methods.
UE5 uses a different RT implementation than straight RTX, so we will see how it goes, because it will become the go-to engine soon.
might be true, but they will have the problem on their hands
since the most popular and average card in a year is going to be 3060 ti
can they afford to lose such a massive percentage of potential buyers?
we'll see
As long as it will run on PS5 and SeX they won't hold back.
SeS is already struggling with RAM capacity tho, which is why the Dead Space remake doesn't have RT on that console.
>since the most popular and average card in a year is going to be 3060 ti
It won't; so far the 1650S is the most common according to gaben, but that counts both desktop and mobile. I would rather say 3060.
the lighting in forspoken looks so hilariously bad, you'd think it'd be because they had dynamic lighting but nope, it's all baked in, just baked in wrong
there's not even a day night cycle
it's no wonder UE5 is picking up steam because some devs just can't get shit done with in house engines
that's what happens when you let amd sponsor a game and force the devs to include all their meme "features" instead of nvidia
not gonna lie amd sponsored titles are always clusterfricks and I'm surprised they don't get called out for it more
didn't they sponsor callisto protocol? kek
yea they did
they sponsor borderlands as well and those games are so bizarrely heavy I just don't understand it, and it's not like the games are pretty or the worlds are very interactive either
We can't have those, sorry.
I wasn't talking only about graphics; can we get good destruction or interaction with the environment? Or a fun dialogue system with less rigid responses? Better AI maybe?
no because there's too many dumb and cheap morons on PC now
>nooo I don't want to buy an nvme SSD for $100, I want to use my 7200 rpm HDD from 8 years ago
>nooo I don't want to upgrade to 32gb ram, I'm still using 8gb of ddr3 ram
>nooo I don't want to upgrade to a modern 6 or 8 core cpu because I've been memed into thinking the i5 2500k is the greatest cpu of all time and anything it can't handle is SHIT
those people are worse than consolegays, at least consolegays
I mean for fricks sake, you can get a solid pc minus the overpriced gpu for like $600-700 and then slap in an old gpu until prices come down, but people can't save up $700 to upgrade from their old dog shit
I'd love for one decent sized AA studio to make a PC game that only runs on modern PC hardware and takes advantage of it to do interesting stuff with enemy AI and physics.
This homie really spent 1000 bucks on a GPU that is outperformed tremendously by a 7 year old GPU kek. Why didn't you just wait, there was no way those prices were sustainable.
That isn't exactly showing the 4070 ti to be a good 4k card considering my 6800 xt runs the game at the same exact FPS.
[...]
DLSS is off there
>That isn't exactly showing the 4070 ti to be a good 4k card considering my 6800 xt runs the game at the same exact FPS.
the anon im replying to said the 4070 Ti gets like 20 FPS in cyberpunk 4K without DLSS. he's wrong
So is RTX.
And anyone knows only RTX is causing issues in cyber troony at 4k.
We don't even have overdrive mode patch yet.
Ultra settings are meme tho.
You either have no idea what that means or you are baiting me.
A CPU bottleneck would mean all those GPUs would have near identical performance. I skip over vendor-specific driver overhead for certain scenarios, because the CPU will heavily limit their performance. So for example you can run CP77 with settings where the CPU will be limiting performance, but even with a 4090 you will be GPU limited on a 12900K if you run that game at 4K native with RT on. So like I said, it depends on the game and settings.
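The resolution argument above can be sketched as a min() of two limits: CPU cost per frame is roughly resolution-independent, while GPU cost scales with pixel count. All figures below are illustrative, not benchmarks.

```python
# Toy model of CPU vs GPU bottlenecking: frame rate is min(cpu_fps, gpu_fps).
# The numbers are made up for illustration; real scaling is messier.

def frame_rate(cpu_fps: float, gpu_fps_1080p: float, width: int, height: int) -> float:
    pixels_1080p = 1920 * 1080
    # Assume GPU throughput falls off linearly with pixel count.
    gpu_fps = gpu_fps_1080p * pixels_1080p / (width * height)
    return min(cpu_fps, gpu_fps)

cpu_fps = 120.0        # what the CPU can feed at any resolution
gpu_fps_1080p = 200.0  # what the GPU can render at 1080p

print(frame_rate(cpu_fps, gpu_fps_1080p, 1920, 1080))  # 120.0 -> CPU-bound
print(frame_rate(cpu_fps, gpu_fps_1080p, 3840, 2160))  # 50.0  -> GPU-bound
```

Same hardware, opposite bottleneck: at 1080p the GPU is waiting on the CPU, at 4K the CPU is waiting on the GPU, which is why 1080p benchmarks expose driver overhead and 4K ones don't.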
You can watch their review for the 7900 cards and hear it directly from the man himself. Or you can sperg out on Ganker.
I think you'll choose the latter, you seem autistic.
Why not give timestamped link to review then?
Because I don't remember exactly where it is, and I don't particularly feel like looking for it. So you can just believe whatever you want to believe.
You are just making shit up, is what I believe in.
Random 4k chart from 7900XTX GN review proves that.
Bottlenecking happens at 1080p, but okay bro.
Slowly getting what I was saying all along?
I didn't bother to read your autistic sperging out.
Cool, now go back to watching tictock.
I guess telling people to go back to preddit got old.
Nvidia suffers more because they have a lot more driver overhead currently
Too bad only HU tests those edge case scenarios. We could find some other strange behaviors.
It's strange because it was always the other way around, and people still shit on AMD's drivers despite them being pretty good for the last couple of years at least.
Yea, it's still the case for older DX11 games that AMD has the "heavy" driver.
But once you get reputation it's hard to get rid of it. Same with driver stability.
He's probably a moron who doesn't realize that the CPU bottleneck only happens at low resolutions like 1080p, which is irrelevant to anyone buying a 4000 series card.
I don't believe you homosexuals saying 12gb vram is too low for 4k.
I bet it will only be so in a few years for the triplest of AAA games. ie cyberpunk dos
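A quick back-of-envelope on why 4K itself barely touches VRAM: render targets are small, textures are what fill the card. The arithmetic below is exact; the game-side counts are illustrative assumptions.

```python
# How big is a single 4K buffer, really? (RGBA8 = 4 bytes per pixel.)
# Game texture counts below are illustrative, not from any real title.

def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / 2**20

# One 4K RGBA8 render target:
print(buffer_mb(3840, 2160, 4))   # ~31.6 MB, tiny next to 12 GB

# A single uncompressed 4K texture is the same ~32 MB; hundreds of them
# (plus ~33% for mipmaps) is what actually eats VRAM at ultra settings,
# and that cost is paid regardless of output resolution.
```

So "12GB is too low for 4K" is really "12GB is too low for ultra texture pools"; dropping the texture setting one notch usually matters far more than the render resolution.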
I got a 3080 12gb for 590usd when it had a pricing error on EVGAs website.
is that a good price? seems ok to me... considering I upgraded from a 1650 super
Shills and morons
One has to make publicity for a product we don't need
The others are idiots who feel the need to justify to a bunch of randoms the money they spent
>have 5700xt
>ordered 6800xt, nitro
>then i realized i am buying just one generation older card and cancelled the order
>7900 is shit, too expensive, sets your house on fire
What do bros? Wait another generation? Isn't buying a new gpu 3 generations apart already too much?
I finally upgraded after 12 years and the biggest difference was going from DDR3 to DDR5 however it's probably because I was running 8gb. Still play the same games btw
built my first pc in July 2022 after having my only gaming machine since 2014 be a base PS4. I have a 3060 ti. Got it for MSRP and it's great. It plays everything I want well, and some games even do 4K well. plus dlss is a thing, and maybe it's cause ive been a consolegay this whole time but I don't notice that much of a difference. then again I use my tv, so 60 FPS is all I need. I guess if I wanted higher framerates I might complain
i actually wanted the 4070 ti but I realized the 192 bit bus makes it kinda not worth it. I'll wait for 50 series. Only problem there is things might get even MORE expensive. In which case i might get a 4070 ti when the 50 series releases for a discount
>192bit bus doesn't matter, it still beats the 3080 in 4K.
The 40 series cards have a lot more cache than the 30 series, and some people say that's why the 4070 ti can perform while having the 192-bit bus. I'm honestly not knowledgeable enough to know though. What do you think?
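A toy model of the cache argument, using the published bus widths and data rates (320-bit / 19 Gbps for the 3080, 192-bit / 21 Gbps for the 4070 Ti) and an invented L2 hit rate purely for illustration:

```python
# Raw GDDR bandwidth is bus width times per-pin data rate; a big L2
# cache cuts the traffic that actually hits that bus. The hit rate
# here is an assumption for illustration -- real ones vary per game.

def raw_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

def effective_bandwidth_gbs(raw: float, l2_hit_rate: float) -> float:
    # If a fraction of requests is served from L2, DRAM only sees the rest.
    return raw / (1 - l2_hit_rate)

rtx_3080   = raw_bandwidth_gbs(320, 19.0)  # 760 GB/s, ~5 MB L2
rtx_4070ti = raw_bandwidth_gbs(192, 21.0)  # 504 GB/s, ~48 MB L2

# With an assumed 35% L2 hit rate, the 4070 Ti behaves more like:
print(effective_bandwidth_gbs(rtx_4070ti, 0.35))  # ~775 GB/s equivalent
```

That's the whole bet Nvidia made with Ada: a 48MB L2 absorbing enough traffic that a 192-bit bus keeps up today. The open question in the thread stands, though — if future games' working sets overflow the cache, the hit rate drops and the narrow bus is exposed.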
3060, and then, unused from a previous build that my friend fricked up when changing parts (the motherboard/cpu, not the card), a 3080. Been too lazy to swap em back out, and since I play at 1080p, a 3060 and a 12700K is... fine.
It's not that we want to, we can't.
How can I shit on AAA garbage if I don't pirate it? Then after a couple of hours of suffering I can strengthen my opinion and tell you why it is in fact bad.
Also not all of them are complete trash.
But I don't even bother getting free copy for majority of them.
the idea is that the gpu you buy should be able to play anything. even if you mostly play non-AAA stuff, on the off chance one comes out that you want to play, it would suck to have to buy a new gpu for it. pretty simple concept, im not sure how it went over your head
>Like why do you want to play it so badly?
I recently played KCD with the HD texture pack and that absolutely demands a powerful GPU if you want to play at a decent resolution and framerate. It's also nice to be able to hold a stable 120+ fps in action games and shooters.
for people where money is no object = 4090
for people who want high performance but don't want to spend 1k = 4070ti
for people on a budget = 6800xt or 3060ti
I'm still rocking a 2070 because I got it for like 200 bucks, new, back in 2020
Meanwhile upgrading will now set me back minimum 400 dollars
Fricking ridiculous
Looks like I'll be one of the few who keep their 1070 for another year. There is nothing to upgrade for until Starfield or Bannerlord mods arrive. I game on an older 144Hz TN panel, so I would want to upgrade my screen at the same time. 1080p is out of the question, all panels currently being sold suck, and 1440p requires a 1000 euro card with 300W power draw. Upgrading is just not worth it.
>There is no reason to buy 4070Ti
>12GB for $800
>There is no reason to buy 4080
>Because 4090 is better value
>There is no reason to buy 4090
>Because 4090Ti is just behind a corner
> ...
>Profit!
some guy wanted to sell me one for $650 but I turned it down because the power draw is outrageous. even with undervolting it would still be unacceptable for me
How many games are on PC that aren't "console shit" that require even a half-decent rig AND aren't total garbage? Literally can't think of one. GPU's keep getting more and more expensive while all the best games on PC are indie titles that probably run on a Switch or a Steam Deck
>completely misses the point of what I said and states exactly what I had in mind
anon... the only indies that require a decent rig are less than a handful. the whole point is this thread is pointless, moronic console babbies crying because they can no longer play linear hallways and big empty open sandboxes
>game devs can't game dev efficiently anymore
>games are bloated shitware with bloated shit engines
>GPU manufacturers have hit a technological wall
>Their solution is to brute force with absolutely moronic gimmick """features""" and charge you more for it
>All game devs lick their balls and will happily cater to these moronic bloated gimmick """features""" on top of continuing to shit out inefficient unoptimized shitware
Yeah I'm happy for the coming global depression to reset some mindsets.
Yea but they have no competition at those price points. There are plenty of better GPUs at 500 mark on the used market or even 3070Ti.
except for the part there's no real competition because nvidia owns the market and can do literally anything they want unless people actually buy amd for once 🙂
NV's own last gen is the competition for now. So unless they sell all the stock before launching the 4060 Ti, there is no way such a high launch price is feasible.
and that's exactly what they're expecting to happen
so yeah I hope you actually like indies otherwise you're going to be forced into buying gpus at insane prices because "moore's law is dead" said mr jacket boy
GPUs aren't selling well, so we will see.
I can JustWait™.
was about to buy a 4070ti but it wont fit properly in my case as it interferes with some motherboard wires.
cards are so thick now holy shit
I literally can't upgrade
3080 has a fake MSRP and was pretty much the catalyst for the shit we see today. The real peak for GPUs was the 1080 ti which still performs solidly to this day as Nvidia used its entire die space on raw compute instead of allocating most of it to RT, AI, DLSS and other gimmicks.
its sad to me that so many people dont realize that ultra settings are a huge meme. High, or whatever the second best preset is called, is almost always enough and gets way better FPS. not to mention you can set it to high then throw in a few ultra settings, and depending on the setting you'll get ultra looks for high performance.
Depends on what you're playing and what you expect.
I would say it's enough unless you want to play the newest AAA games for the next 2 years with high framerates and without upscaling.
it probably was better, not gonna lie. the 13 had some serious heat issues, and honestly, i lied, 14 inches is kino inches, all my systems follow that sizing.
does the section of your keyboard with the letters rty get hot? all my small bois do
>bestbuy literally out of gpus
>newegg only selling used crypto gpus from chink sellers
where do you buy cards now? or is everyone just waiting?
I wanted to upgrade for better vr but there's no point
I can't believe nvidia fricked up enough that people unironically convince others to buy Radeon. AMD's name was mud for years and nvidia was basically the only player on the field.
nvidia is living up to their name so it's only fitting
a 6700xt would be nice about now to play some indies and some shit from the past few years I want at high refresh rate 1440p
last time i was there, i waited 2 hours outside in a car waiting for my "number" to come up, put on my mask, and then got told that someone in line must have bought the monitor the website said was in stock. i still miss it. why do you have to be 6 hours away by car, frick you microcenter
If money is no object and you need a GPU NOW, 4090.
The 4070 Ti is generally a better value and, outside of a few 4K benchmarks, is a bit better than or equal to the best of the previous gen (3090 Ti) while being cheaper and more power efficient.
Or split the difference and go for a 4080.
>my 2014 PC has died
Frick forced updates
Recommend me a GPU in terms of price and performance. I’m not into 4K and breaking the limit or whatever it’s called.
upgrading my gpu for the first time bros. (2060rtx to 3080). I'm a little nervous because I'm expecting a cable to be missing or some shit like that but very excited. Never ever have I used a highish end gpu. Running squad at 1440p ultra wide is going to be epic...
The other thing worth keeping an eye on here is that both nVidia and AMD have wafer supply agreements with TSMC. Those were locked in several quarters ago, and both companies will have to cut prices if they’re going to avoid a massive build up of inventory. The chips are getting manufactured regardless of end user demand.
AMD is in a marginally better position in this regard simply because of the breadth of their portfolio at this point - if consumer desktop isn't selling, they can use those wafers for Epyc, or laptop chips, or even GPUs or FPGAs.
There is only a certain number of people who will pay $1600 for a graphics card. You will eventually reach a point (pretty quickly) where everyone who can justify buying a 4090 at the current price will already have one. At that point excess stock simply will not sell. Price has to come down.
>There is only a certain number of people who will pay $1600 for a graphics card.
>tfw no comfy six figure wfh web dev job where i get paid to shitpost on Ganker and play runescape all day.
Sometimes I think about upgrading, but I've never had a real reason to since I mainly play old games, indie games and emulate.
If I ever do upgrade it'll likely be a 20xx card. Maybe a 30xx card but I'd never need that much power tbh.
so explain to an illiterate techgay like me, is a 3080 10gb and a 3080 12gb gonna have much of a difference? a couple of hundred dollars for a couple of frames seems kind of excessive.
Not him, but it's a pretty big difference. The 12gb 3080 is basically a 3080ti. It's not just the VRAM, it also has more CUDA cores and a bigger bus. There's no MSRP, so if you're lucky you can find one for a good price. I got mine (MSI) for under $800, which is pretty great.
>gpus barely 3yrs old dont even get driver updates
https://wccftech.com/amd-has-not-released-a-new-radeon-driver-for-rdna-2-radeon-rx-6000-gpus-in-2-months/
Nice one amd, no drivers. sold my 6900xt 2mo ago, good timing on my part. Was gonna get a 7900xtx but im over amds issues; vr and encoders have been broken since may last year, unfixed in the new cards
They're just releasing these emergency drivers for the 7000 series because of stability issues. But the 6000 and 5000 series will get driver updates eventually, just be patient.
3060 TI and its power is so fricking unnecessary. There's like 2 games, CP77 and Hogwarts Legacy, that I really even need it for. It's in its own rig, and this PC that i'm posting from now has a waaaaaaaaay less capable graphics card, but it can handle all the games I actually play just fine, so the other computer is practically a console
How's a 6700XT or 6600XT?
My old RX 480 is finally dying and can't run everything I throw at it smoothly anymore, so I'm looking to upgrade. It's basically impossible to get Nvidia at a decent price where I live (UK) so I'll be passing on that
upgraded from a 1080ti i've had for 6 years to a 4070ti
Tech gays like you might cry because "MUHHH WAIT" ""MUUHHHH AMD"
but the majority of people don't give a frick about bus width. The game runs at 4K 60fps and that's all I ask of my card
Now frick off and go back to your mom basement
You sound like the same people who bought a 3080 10gb and were like "DURR WHO CARES IF IT HAS LESS VRAM, LESS CORES, AND IS A GIMPED CARD I PAID $1000 TO PLAY AT 1440P-4K"
we warned you
its better than most poorgays on here why are you sad
you have a poorgay hobby if you can have top of the line equipment for a couple thousand dollars. have sex
the steam hardware stats page is a serious consolegay filter
so what exactly is the problem with the 4070ti? isnt the same as a 3080 or something?
id say its more like a 3090 non-Ti with way less power usage. The "3 times better than 3090 Ti" thing is bullshit propped up by Frame Generation, which is a nice feature but has issues. maybe with time it will get better
okay, so I googled around, and apparently the "issue" with this GPU is that there were supposed to be two 4080s, one with 16gb of vram and the other with 12gb.
So people are mad the 4080 12gb got renamed to 4070ti.
what a fricking waste of time.
>okay, so I googled around, and apparently the "issue" with this G
it wasnt just VRAM. the chip was also lesser. The 4080 16GB uses AD103, the 4080 12GB uses AD104. Lower number is better (3090/Ti and 4090 use AD102). The 4080 12GB/4070 Ti also has slower bandwidth and way fewer cuda cores. it is literally not an 80 class card in any way but nvidia tried to sell it as one. its not just the vram
so what is the problem? it got renamed to a lower tier like it should.
>so what is the problem it got renamed to a lower tier like it should.
people were mad they were calling it an 80 and it was $900 USD. now they're mad about the price (and still the gimped bandwidth). if you watch youtube benchmark videos its a decent card even at 4K but it should cost like $600
I looked at reviews and at 1440p it matches even a 3090ti sometimes.
this is like saying we want a brand new 3090ti for $600. I mean I get you want cheap GPUs, I do too, but dude, this is being unreasonable; it would cannibalize any sales of previous 3000 series gpus still in stock.
>this is like saying we want a brand new 3090ti for $600.
precisely
> it would cannibalize any sales of previous 3000 series gpus still in stock.
good
also i wonder how the 192-bit bus will hold up in the future. some say the increased cache makes the slow bus matter less but idk
>I mean I get you want cheap GPUs, I do too but dude, this is being unreasonable
xx70 cards were the $350 cards; actually the 970 was the $329 MSRP card. even inflation doesnt account for the greed, NV shill. keep eating their shit and smiling, im sure its good for you
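The inflation claim above is easy to sanity-check. The multiplier below is an assumed ~30% cumulative US inflation from 2014 to 2023, an approximation rather than official CPI data.

```python
# Does inflation explain the jump from the 970's $329 to the 4070 Ti's $799?
# The CPI factor is a rough assumption (~30% cumulative, 2014 -> 2023).

GTX_970_MSRP_2014 = 329
CPI_2014_TO_2023 = 1.30  # assumed multiplier, not official data

inflation_adjusted = GTX_970_MSRP_2014 * CPI_2014_TO_2023
print(f"970 MSRP in ~2023 dollars: ${inflation_adjusted:.0f}")  # ~$428
```

Even with a generous multiplier the 970's price lands around $430 in 2023 dollars, a bit over half the 4070 Ti's $799 launch price, which is the poster's point.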
It has terrible price per performance: too expensive for mid tier gaming, and too low in performance for high tier gaming.
don't buy it then? what are you stupid or something.
I'm not. Why are you asshurt that I'm pointing out that it's shit?
>terrible price to performance
I don't understand why people keep posting this, and i don't even have a 4070 ti
Because it's true homosexual.
Here is why: the 3080 came out two years ago as a nearly top tier GPU for $699; now we get a 20% faster GPU for $799.
4070Ti is not even half of 4090.
>but crypto
>muh inflation
WHO CARES.
No progress was made other than lower power draw.
price/performance is becoming a flawed metric anyway; at some point it doesn't matter how good the ratio is if the nominal price of the GPU is extremely high
we're at the point where new 'midrange' GPUs cost more than half of an entire PC build
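to make that concrete, a toy $/fps calculation; the fps numbers are made up for illustration, not benchmarks:

```python
def dollars_per_frame(price: float, avg_fps: float) -> float:
    """Naive price/performance metric: sticker price divided by average framerate."""
    return price / avg_fps

# hypothetical numbers: the pricier card can win on $/fps and still be unaffordable
print(round(dollars_per_frame(799, 120), 2))  # 6.66
print(round(dollars_per_frame(500, 80), 2))   # 6.25
```

which is the point: a slightly "better" ratio doesn't help if the nominal price is out of reach.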
>at 4k
>and too low of performance for high tier gaming
homie, what the frick are you talking about? I've built multiple 4070 Ti PCs for my job and they run games at ultra in the 100s of FPS with raytracing on at 1440p
Some Ganker tards are smoking crack. Don't expect sanity.
>$799 GPU
>1440p
are you moronic
>too low of performance for high tier gaming.
What the frick is high tier gaming? What can't you run with this?
that anon is just moronic
>mid tier gaming
What is this moronic shit, man? Just because it has a 70 at the end doesn't mean it's mid-tier, especially when it's trading blows with much higher-numbered cards from the previous gen at 1440p
the only real problem is the name/price, but for someone who would've been in the market for a 3080/3080 Ti/3080 12GB, the 4070 Ti is pretty much a slight upgrade for the same price
basically the 4070 Ti makes the 3080 cards a bad deal unless you can get them below MSRP
>the 4070 TI makes the 3080 cards a bad deal unless you can get them below msrp
3080 10GBs are like 500 on ebay
personally I don't like buying used hardware, especially not when there's miners trying to offload their gpus
it's not even about being worried about buying a burnt out gpu, I just don't want to give the people who drove up the prices of gpus any money
as long as the GPU works for years, imo that's better than paying 40 series prices for a 40 series card.
>isnt the same as a 3080 or something?
It outperforms the 3090 ti in 1080p and 1440p, loses slightly in 4K
It's a rebranded, gimped 4080 at a ridiculous price that can be outclassed by the 3090 Ti and several AMD offerings. Remember the awful cancelled 4080 12GB? The 4070 Ti is the EXACT SAME card released under a different name to try to skirt the outrage.
192-bit memory bus instead of 256-bit like the 3070 Ti, and it hurts in 4K
Nvidia gimped it with just 192 bit bus and just 12gb of vram.
>say something positive but post le negative picture
>le gigachad meme template
troony
>consoles have 16GB VRAM
midrange(lol) bros, not like this
PS5 has 8GB of RAM spread across VRAM and system memory. Besides, I don't really understand why people see it as a big deal to not be able to run 4K ultra textures in 0.1% of games.
>PS5 has 8GB of RAM spread across VRAM and system memory
that can't be right
Double check things before you post.
PS4 had 8GB, PS5 has 16. You are correct that its shared, though
No it doesn't, PS5 has access to 13GB and it is shared.
>a PC has no OS overhead
shared is the important part; that's not 13GB of VRAM, not even close.
That's not the point I was making. You're not wrong about the shared part
What difference does that make? A PC's OS doesn't eat into VRAM unless it uses some shitty APU.
>3060 Ti
nice, got mine last week
>6600 XT
I'm waiting for 7900XT prices to drop by around 200€ in my country. No new gaymes worth playing right now anyway.
never forget
970 was good until it suddenly wasn't.
Still serves in PC I gave away to my friend.
The best there ever was.
need one with 4080 12gb and under it says 4060 12gb
>970
I don't care, I'm upgrading from a GTX 1050 Ti, so it's an upgrade through and through.
I went from a 1060 to the 4070 Ti and I couldn't be happier.
At the current rate of GPU progression, this should hold me for a while.
too bad about the price though
It's a sellers market.
Anyone notice that HWInfo hasn't ever updated the Nvidia logo?
so? its kino
>1080
>3060Ti
same. its great
It's a good one. I just wish I had a better CPU but can't be arsed to upgrade
I'm the same. Upgraded my 970 to a 3060 Ti after having the 970 since like 2014
Now my CPU, an i5-8600, is the bottleneck and needs replacing. Problem is, every time I upgrade my CPU I need a new mobo, so this shit gets expensive
my motherboard cost more than my i5 13600k
what the shit what motherboard ?
ASUS ROG Strix Z690-G DDR5
>ROG Strix
all i know about this naming scheme is that its always Asus's most expensive shit
it's their GAMER brand with MILITARY GRADE HEATSINKS, but the only reason I got it was that there are few mATX DDR5 boards
Still the champ for 1080p gaming.
i use it at 1440p
the duality of man
the homura poster is moronic 3060 ti is great
>3060 ti
Bought mine at launch for MSRP, sold it for profit before the crypto crash. It also dug 200€ worth of meme-coins. Feels good.
>3070
>GTX 1650
I'm upgrading from a 980 to a 3070; what should I do with my old card now? I doubt anyone would want a card that's been used regularly for seven years.
>sell it
>or put it in a backup build
>make a backup build that also doubles as a media server or HTPC
>if you do make a backup build, use the 980 to test recent games and make youtube videos so poorgays can see how low the FPS will be on the latest AAA goyslop on their poorgay rigs
>keep it in storage just in case something ever happens to your 3070
>fell for the sli meme
>upgrade from 1080 to 4070ti
>constant 30fps 1% lows
>some games flat out run worse somehow like Hoi4 and Holdfast
Let me guess you still have your old CPU and RAM?
You only notice those drops now because they are of bigger magnitude.
i still have my old i9 9900k hanging in there with a 4070 Ti; it's still performing like a champ, though this is definitely its last stand
>old i9 9900k
shit weird to think of it like that. i'm holding onto mine until it dies.
>old i9 9900k
practically ancient egyptian over here with my 2600k
>tfw 3770 equivalent Xeon here
frick, are you serious? i'm about to upgrade from a 1080 to a 4070ti... i have a 5600 cpu. how fricked am I?
and here I thought I was bad with pc logistics
watch some youtube videos at least
5600 can go up to 3070ti at the very max before bottlenecks
Don't listen to Ganker tards do your own research.
CPU Bottleneck. GPU doesn't just magically give you more frames.
>Ti
we all know nvidia is going to wienerblock their 3xxx users just like they wienerblock their older gen from new features so better pony up for the 4xxx series like a good goy
what new features? raytracing doesn't work below 4090 and dlss is a gay meme.
>what new features?
raytracing
dlss 3 frame generation
they may be meme features, but they are still features. not worth the money imo but features nonetheless
yeah miss me with that
yeah dlss 3 is garbo but eventually they'll fix it and wienerblock the 3xxx series
dlss is fundamentally garbage i have no idea why it's developed or promoted
t. 1060 owner
weird flex, dlss is aimed at poorgays.
It depends on graphics settings and the game.
You aren't playing Cyberpunk at native 4k with ray tracing on at consistent 60 fps even with a 4090.
so a cope. a blurry shit cope. dlss only seems nice in marketing, trying it out i was disgusted.
t. 1080p player
Looks good on my 4k monitor.
wrong, but you do you. dlss looks like shit to me.
dlss is fine and looks fine. the problem is devs using it (and FSR) as a crutch; games should run well as intended without either being necessary
I will still disagree. I only tested it on Control and Cyberpunk, on a 4k screen, and I still deferred to either lower settings or resolution because it just felt wrong.
that's funny because a 4090 cannot even keep a consistent 60 fps in dead space without dlss
No dipshit. Its so you can get 60fps on your $1600 gpu when you turn on RT
dlss 3.0 is great because it can beat cpu bottlenecks
dlss 2 is good. whats your issue with it
it improves frames for minimal image quality loss
>minimal image quality loss
I see it and don't like it. Call it placebo or looking for issues but I don't like it. If you don't mind whatever.
Here you go, anon: which half is native 4K and which half is DLSS Performance (internal 1080p)?
This should be really easy, since there are a few specific details here that DLSS handles very poorly.
not that anon but wouldn't it make more sense to have the same part of the image on both sides ?
It should be easy either way.
now you've switched the sides; native is on the right, upscaled on the left
Correct
NTA but the right is obviously the upscaled one.
>a picture of a downscaled render
good going, moron shill, make more of these while DLSS still looks like shit in motion to anyone with a pair of eyes
>Downscaled render
Anon...
oh i'm sorry, it's a high-settings image rendered at a lower resolution and upscaled to your resolution. how is that desirable? and i don't care about your answer, because if it works for you, fine. i don't like it.
I think it looks good in motion. There are certain artifacting problems at 4K native that DLSS solves (though it introduces blurring on the movement of certain objects). The yellow curb dots make it obvious which one is upscaled, but they are horrendously unstable in motion at native 4K, and not with DLSS on.
I think ray tracing looks great in Cyberpunk, and I'd rather have it on with DLSS at 70 fps than at native 4K getting a little under 30.
>the graphics are akshuly unstable in native
You sure your brain hasn't just gotten used to the glitches caused by the DLSS?..
All graphics engines feature some kind of artifacting in their rendering.
Unstable thin lines and other aliasing-related problems are a common one, and DLSS (or DLAA, its implementation without downscaling) resolves them much better.
In some games I would rather use DLSS at Quality (internal 1440p) even when I don't need the performance boost, because TAA is such shit.
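for reference, the DLSS 2 presets render at fixed per-axis scale factors before upscaling. these are the commonly published factors; games can override them, so treat them as approximate:

```python
# Per-axis render scale for the common DLSS 2 presets (approximate, game-dependent)
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(out_w, out_h, mode):
    """Resolution DLSS actually renders at for a given output resolution and preset."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
```

which matches the "Performance = internal 1080p" and "Quality = internal 1440p" figures thrown around in this thread for 4K output.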
>All graphics engines feature some kind of artifacting in their rendering.
holy fricking cope
DLSS is good if you use it with DSR, because then it works like good anti-aliasing, which is something DX12 games don't have. But yes, anyone who uses DLSS at native output resolution and then says "lol no difference" is fricking blind and wasted their money on a monitor they can't tell the difference on anyway.
3080
>1080 Ti
>crying about a 4070 Ti
b***h I have an RTX 3060 and I never even fully utilize it. Most good games are made by indie studios now and honestly I regret upgrading from my GTX 1060.
>3060 Ti
it's an amazing card, what's your deal?
t. has a 3060 ti ftw3 ultra
Look at the steam surveys. Even my 3060ti is a rare richgay card according to that
where my 2070 Super bros at?
still waiting for a game that needs me to upgrade
I'm here anon
lads, give it to me straight
I have an MSI 1070 8GB right now and want to upgrade to something better suited for AI-generated cooms. What should I get?
3060. About as fast as a 1080 Ti while having 12GB VRAM.
The 3060 Ti and 3070 are even faster, but they only have 8GB.
so what's the difference between the 3060 and the Ti when it comes to AI stuff, as well as Blender?
the 3060 has more VRAM but the Ti is faster
who comes out on top? does 8GB just bottleneck the Ti?
All the benchmarks, for games at least, show the extra speed of the Ti giving better performance than the extra VRAM
Maybe if you're playing Skyrim with 8K texture mods the VRAM is better
I can't speak for Blender, but for SD the 3060 lets you train models more easily and prompt higher-res images, while the 3060 Ti prompts faster
Though I hear with LoRA you can train with 6GB; you're better off asking the folks at Ganker for more info
Check Ganker and stable diffusion breads.
>RTX 4080
i don't know why you guys keep saying the 4070 Ti sucks for 4K when youtube videos show it runs fine unless you turn on memetracing, and even then it performs at about 3090 level. the only question is how long it can keep that up as games get more demanding. for current games it seems fine at 4K
Because you don't fricking change your GPU every gen unless you are moronic or have too much money. 12GB is an insult on an $800 GPU; you will soon struggle with that much VRAM at 1440p. RT will become the norm whether you like it or not. People cry about it, but this is the price of progress.
>just lower settings
>800 fricking dollars
/rant
Have a nice day.
>Because you don't fricking change your GPU every gen unless you are moronic or have to much money.
i'm not suggesting anyone do that. plus, not everyone has a 30 series or RX 6000 GPU.
>you will struggle with that VRAM to run games at 1440p soon.
doubt
>RT will become norm if you like it or not.
turn it off. boom problem solved
>doubt
You underestimate the incompetence of devs. When the 3070 came out people were saying the same; one year later you already had VRAM issues at 1440p with DOOM Eternal or Resident Evil. It wasn't severe to the point of being unplayable, but we are getting there.
>turn it off. boom problem solved
Games will look worse than on PS3 without it. See above: lazy devs.
Flopspoken is the best example; the game looks like shit and still requires high-end hardware.
But then again, there are like one or two AAA games worth playing each year.
>Games will look worse than on PS3 without it.
doubt. raytracing isn't the problem in Forspoken. if it were, turning it off would improve performance. the game is just poorly made overall
I didn't make myself clear. Once RT becomes more mainstream there won't be nearly as many fallback solutions implemented in games: things like SSAO, SSR, pre-baked shadows, light probes, etc.
RT just takes less time to implement than the traditional methods.
UE5 uses a different RT implementation than straight RTX, so we'll see how it goes, because it will become the go-to engine soon.
might be true, but they will have the problem on their hands
since the most popular and average card a year from now is going to be the 3060 Ti
can they afford to lose such a massive percentage of potential buyers?
we'll see
As long as it will run on PS5, SeX they won't hold back.
The SeS is already struggling with RAM capacity though; that's why the Dead Space remake doesn't have RT on that console.
>since the most popular and average card in a year is going to be 3060 ti
It won't; so far the 1650S is the most common according to gaben, but that counts both desktop and mobile. I'd rather say the 3060.
the lighting in Forspoken looks so hilariously bad. you'd think it'd be because they used dynamic lighting, but nope, it's all baked in, just baked in wrong
there's not even a day/night cycle
it's no wonder UE5 is picking up steam, because some devs just can't get shit done with in-house engines
that's what happens when you let amd sponsor a game and force the devs to include all their meme "features" instead of nvidia
not gonna lie amd sponsored titles are always clusterfricks and I'm surprised they don't get called out for it more
didn't they sponsor callisto protocol? kek
yea they did
they sponsored Borderlands as well, and those games are so bizarrely heavy, I just don't understand it; it's not like the games are pretty or the worlds very interactive either
I'm trying to get a 3000 card just so I get 60 fps again in video games.
Stop playing goyslop
no
No.
is he right ?
Yes.
We need games that push tech and what is possible.
How about games just be good? I would be satisfied with that
We can't have those, sorry.
I wasn't talking only about graphics, can we get good destruction or interaction with environment? Or fun dialogue system with not so rigid respones? Better AI maybe?
no because there's too many dumb and cheap morons on PC now
>nooo I don't want to buy an nvme SSD for $100, I want to use my 7200 rpm HDD from 8 years ago
>nooo I don't want to upgrade to 32gb ram, I'm still using 8gb of ddr3 ram
>nooo I don't want to upgrade to modern 6 or 8 core cpu because I've been memed into thinking the i5 2500k is the greatest cpu of all time and anything it can't handle is SHIT
those people are worse than consolegays, at least consolegays
I mean, for frick's sake, you can get a solid PC minus the overpriced GPU for like $600-700 and then slap in an old GPU until prices come down, but people can't save up $700 to upgrade from their old dog shit
frick I accidentally hit submit
I was going to say at least consolegays are forced to use fast SSDs and more RAM with new consoles
I'd love for one decent sized AA studio to make a PC game that only runs on modern PC hardware and takes advantage of it to do interesting stuff with enemy AI and physics.
>1070
>3060
Same. I also spent 1000 bucks on it during the covid price hike
Damn son you went full moron, I'd rather have used integrated graphics than spend 1000 bucks on that fricking turd.
Ah well, you live and you learn
This homie really spent 1000 bucks on a GPU that gets tremendously outperformed by a 7 year old GPU, kek. Why didn't you just wait? There was no way those prices were sustainable.
My new PC parts were just sitting there, waiting to be put together... Everything else was sold out.
Ah I get it, it was a pretty shit time to be building a PC. Hope you at least used stimulus money or something.
Compared to the other things in my life that I spend money on, a $400 price difference is not much
>2060 super
based, id still be on it if i didn't get my 3060ti for free
>if i didn't get my 3060ti for free
whose dick did you suck
>2060 Super
same. It's not bad, but it feels like it's on the verge of falling behind: in new games it can't hold high settings without limping along at 60 fps.
Pointless card, it can't do 4k, might as well buy a 3060 TI for half the price and stick to 1080p
> it can't do 4k
yes it can
It does like 20 FPS without DLSS in Cyberpunk 2077 at 4k
>but it's an unoptimized game
Still proves that the card is not future proof enough.
>Still proves that the card is not future proof enough.
No one made that claim, though. Matter of fact, no x70 card has ever made that claim
>It does like 20 FPS without DLSS in Cyberpunk 2077 at 4k
liar
?t=225
Not him but
>DLSS
>4k
Pick one.
DLSS is off moron
>That isn't exactly showing the 4070 ti to be a good 4k card considering my 6800 xt runs the game at the same exact FPS.
the anon I'm replying to said the 4070 Ti gets like 20 FPS in Cyberpunk at 4K without DLSS. he's wrong
That isn't exactly showing the 4070 ti to be a good 4k card considering my 6800 xt runs the game at the same exact FPS.
DLSS is off there
So is RTX.
And everyone knows only RTX causes issues in cyber troony at 4K.
We don't even have the overdrive mode patch yet.
Ultra settings are a meme tho.
4080
4090
7900XT
7900XTX
They all bottleneck when paired with a 12900k.
>Blanket bait statement
It's true, though.
It's not.
Without telling what gayme, settings etc it's just worthless statement.
According to Gamers Nexus, it is. Their 12900K test bench bottlenecks with those four cards.
You either have no idea what that means or you are baiting me.
A CPU bottleneck would mean all those GPUs have near-identical performance (skipping over vendor-specific driver overhead in certain scenarios), because the CPU heavily limits them. For example, you can run CP77 with settings where the CPU limits performance, but even with a 4090 you will be GPU-limited on a 12900K if you run the game at 4K native with RT on. So like I said, it depends on the game and settings.
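put simply, framerate is roughly the slower of the two stages. a sketch of the idea; the fps caps below are hypothetical numbers, not benchmarks:

```python
def framerate(cpu_fps_cap, gpu_fps_cap):
    """Simplified pipeline model: whichever stage is slower sets the framerate."""
    return min(cpu_fps_cap, gpu_fps_cap)

# 1080p: the GPU could render 240 fps but the CPU only feeds 160 -> CPU-bound
print(framerate(160, 240))  # 160
# 4K native + RT: the GPU cap drops below the CPU cap -> GPU-bound again
print(framerate(160, 90))   # 90
```

that's why "12900K bottlenecks a 4090" is only true at settings where the CPU-side cap is the lower one.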
You can watch their review for the 7900 cards and hear it directly from the man himself. Or you can sperg out on Ganker.
I think you'll choose the latter, you seem autistic.
Why not give timestamped link to review then?
Because I don't remember exactly where it is, and I don't particularly feel like looking for it. So you can just believe whatever you want to believe.
You are just making shit up, is what I believe in.
Random 4k chart from 7900XTX GN review proves that.
Bottlenecking happens at 1080p, but okay bro.
Slowly getting what I was saying all along?
I didn't bother to read your autistic sperging out.
Cool, now go back to watching tictock.
I guess telling people to go back to preddit got old.
Nvidia suffers more because they have a lot more driver overhead currently
Too bad only HU tests those edge-case scenarios. We could find some other strange behaviors.
It's strange, because it was always the other way around, and people still shit on AMD's drivers despite them being pretty good for the last couple of years at least.
Yeah, it's still the case for older DX11 games that AMD has the "heavy" driver.
But once you get a reputation it's hard to get rid of it. Same with driver stability.
He's probably a moron who doesn't realize that the CPU bottleneck only happens at low resolutions like 1080p, which is irrelevant to anyone buying a 4000 series card.
>GTX 750 TI
>1080ti
I don't believe you homosexuals saying 12GB of VRAM is too low for 4K.
I bet it will only be an issue in a few years, for the triplest of AAA games, i.e. cyberpunk dos
I almost wouldn't care about new cards if not for Capcom. There literally are no AAA studios left.
Thanks for your vote on Nvidia's current business model. Enjoy your video games
Got a 4070TI for the same price as a 3070TI.
I'm content with my purchase
how?
Newegg.
I got a 3080 12gb for 590usd when it had a pricing error on EVGAs website.
is that a good price? seems ok to me... considering I upgraded from a 1650 super
Great deal. The 12GB 3080 is almost a 3080 Ti in terms of performance.
Enjoy them games.
Still waiting for the RTX 5050 Ti in two or three years. My last NVIDIA card was a GTX 550 Ti, so I wanna see how that extra zero performs.
>thinking nvidia will ever make affordable low/mid tier GPUs ever again
3060 Ti or 3070 Ganker? Which one do I get
Either way your card is gonna be useless in a year, so get the cheaper one.
why are pcgays so entitled?
Shills and morons
One has to make publicity for a product we don't need
The others are idiots that feel the need to justify the money they spent to a bunch of randoms
>4070 ti aorus
>have 5700xt
>ordered 6800xt, nitro
>then i realized i was buying a card just one generation older and cancelled the order
>7900 is shit, too expensive, sets your house on fire
What do, bros? Wait another generation? Isn't waiting 3 generations to buy a new GPU already long enough?
new thing is always going to be more expensive than old thing when it comes to upgrades
I think they might be pushing for huge PR campaign how they lowered the price and are your friends so masses buy.
> Isn't buying a new gpu 3 generations apart already too much?
CONSOOMer detected
5700xt is still solid. Don't reward them for their extortionate prices
5700XT. Will probably upgrade next year. Not a lot of games I need more power for tbh.
Based Navichads.
I finally upgraded after 12 years and the biggest difference was going from DDR3 to DDR5, though that's probably because I was only running 8GB. Still play the same games btw
How does it feel to finally be a gamer?
feels bad man video games suck
>buy gpu
>it comes with a trading card
what in the frick
considering the games Gankertards are allowed to like, they could get any 30 series card and never have to upgrade again.
>still on an i5 2500k
>tfw I realize my next CPU build will in all likelihood last me 20 years
No it won't. That would be like running a single-core Pentium 4 today
>3090
>3060ti
>RTX 2080
>Exclusively play indie and RPGM games
>GTX 980
I mean, considering I mostly play indies I can't really bother changing it until it shits the bed.
>8800gtx
>integrated graphics
>year 2023 and 1440p cards cost $800
Grim
And you'll keep voting for the Federal Reserve Uniparty that led to this state of affairs while feeling smug.
>RX 6650 XT
it'll do
ever notice how pc gays spend more time worrying about specs and settings vs the games themselves?
i can do both, can you?
>1050Ti
>3070
>I5-9600KF
built my first PC in July 2022, after my only gaming machine since 2014 had been a base PS4. I have a 3060 Ti. Got it for MSRP and it's great: it plays everything I want, and some games even do 4K well. plus DLSS is a thing, and maybe it's because I've been a consolegay this whole time, but I don't notice that much of a difference. then again, I use my TV, so 60 FPS is all I need. I guess if I wanted higher framerates I might complain
>60 fps
you're missing out
am I?
1440p 144hz > 4k 60hz
Amen.
>RTX 2060
i actually wanted the 4070 Ti but realized the 192-bit bus makes it kinda not worth it. I'll wait for the 50 series. the only problem is things might get even MORE expensive, in which case i might grab a 4070 Ti at a discount when the 50 series releases
The 192-bit bus doesn't matter; it still beats the 3080 in 4K.
>192-bit bus doesn't matter, it still beats the 3080 in 4K.
The 40 series cards have a lot more cache than the 30 series, and some people say that's why the 4070 Ti can perform despite the 192-bit bus. I'm honestly not knowledgeable enough to know, though. What do you think?
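the usual hand-wavy argument: every cache hit removes a DRAM access, so with hit rate h the bus only has to serve (1 - h) of the traffic. a deliberately simplistic model, and the hit rates below are made up to show the scaling, not measured numbers:

```python
def effective_bandwidth(dram_gbs, cache_hit_rate):
    """Toy model: apparent bandwidth when a fraction of requests hit on-die cache."""
    assert 0 <= cache_hit_rate < 1
    return dram_gbs / (1 - cache_hit_rate)

# 4070 Ti: 504 GB/s over the 192-bit bus; hypothetical hit rates for illustration
print(round(effective_bandwidth(504, 0.30)))  # 720
print(round(effective_bandwidth(504, 0.50)))  # 1008
```

in practice the hit rate depends on working-set size, which is why people worry about how it holds up at 4K.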
3060 and then, unused from a previous build that my friend fricked up when changing parts (the motherboard/cpu not the card) a 3080. Been too lazy to swap em back out and since I play at 1080, a 3060 and 127k is... fine.
thoughts on intel arc?
I don't get these threads
>NOOOO MY GPU CAN'T PLAY A DOGSHIT AAA GAME NOOOOOOOOOOO!!!!!!
Like why do you want to play it so badly?
It's not that we want to, we can't.
How can I shit on AAA garbage if I don't pirate it? After a couple of hours of suffering I can strengthen my opinion and tell you why it is, in fact, bad.
Also, not all of them are complete trash.
But I don't even bother getting a free copy of the majority of them.
the idea is that the GPU you buy should be able to play anything. even if you mostly play non-AAA stuff, on the off chance a big game comes out that you want to play, it would suck to have to buy a new GPU for it. pretty simple concept, i'm not sure how it went over your head
Why cant devs make optimized games?
Because people buy them anyway.
>Like why do you want to play it so badly?
I recently played KCD with the HD texture pack, and that absolutely demands a powerful GPU if you want a decent resolution and framerate. It's also nice to be able to hold a stable 120+ fps in action games and shooters.
what are even the best cards at the moment?
i've been on a 1080ti for 6 years
for people where money is no object = 4090
for people who want high performance but don't want to spend 1k = 4070ti
for people on a budget = 6800xt or 3060ti
probably going 4070ti
4090 seems overkill and will feel dated/expensive later
I'm still rocking a 2070 because I got it for like 200 bucks, new, back in 2020
Meanwhile upgrading will now set me back minimum 400 dollars
Fricking ridiculous
>7 year old build made for 1080p gaming.
>still don't feel the need to upgrade
Modern games are trash and 4k is a meme
>5700 xt
if the 4070 ti were $500 id buy one immediately
Looks like I'll be one of the few who keep their 1070 for another year. There is nothing to upgrade for until Starfield or Bannerlord mods arrive. I game on an older 144Hz TN panel, so I would want to upgrade my screen at the same time. 1080p is out of the question, all panels currently being sold suck, and 1440p requires a 1000 euro card with 300W power draw. Upgrading is just not worth it.
I am playing elden ring on highest settings with a 970 and nobody can stop me
the way I see it there is no reason to buy a 4080. either 4070 ti or 4090
>There is no reason to buy 4070Ti
>12GB for $800
>There is no reason to buy 4080
>Because 4090 is better value
>There is no reason to buy 4090
>Because 4090Ti is just behind a corner
> ...
>Profit!
>6700xt
Where the 3090ti chads are?
some guy wanted to sell me one for $650 but I turned it down because the power draw is outrageous. even with undervolting it would still be unacceptable for me
>geforce gtx 1070
Remember to repaste and change thermal pads on used mining GPUs.
thats toothpaste
Modern GPU pricing has put PC behind consoles.
if you play console shit it is
but in that case why the frick aren't you buying a console moron?
>but muh fps
go have a nice day
How many games on PC that aren't "console shit" require even a half-decent rig AND aren't total garbage? I literally can't think of one. GPUs keep getting more and more expensive while all the best games on PC are indie titles that would probably run on a Switch or a Steam Deck
>completely misses the point of what I said and states exactly what I had in mind
anon... the indies that require a decent rig number less than a handful. the whole point is that this thread is pointless: moronic console babbies crying because they can no longer play linear hallways and big empty open sandboxes
they hated him because he told the truth
>1650
remember games?
>game devs can't game dev efficiently anymore
>games are bloated shitware with bloated shit engines
>GPU manufacturers have hit a technological wall
>Their solution is to brute force with absolutely moronic gimmick """features"""" and charge you more for it
>All game devs lick their balls and will happily cater to these moronic bloated gimmick """features""" on top of continuing to shit out inefficient unoptimized shitware
Yeah I'm happy for the coming global depression to reset some mindsets.
you fell for the beta test
https://videocardz.com/newz/nvidia-geforce-rtx-4070-non-ti-rumored-to-have-2475-mhz-gpu-boost-clock
If it ends up at $500 it will sadly still sell.
Keep dreaming. That's going to be the 4060 Ti price
That would be moronic even for current market.
so, just like the $900 70 Ti and the $1200 4080
Yeah, but they have no competition at those price points. There are plenty of better GPUs around the $500 mark on the used market, or even the 3070 Ti.
except there's no real competition, because nvidia owns the market and can do literally anything they want unless people actually buy amd for once 🙂
NV's own last gen is the competition for now. So unless they sell all the stock before launching the 4060 Ti, there is no way such a high launch price is feasible.
and that's exactly what they're expecting to happen
so yeah, I hope you actually like indies, otherwise you're going to be forced into buying GPUs at insane prices because "moore's law is dead", said mr jacket boy
GPUs aren't selling well, so we will see.
I can JustWait™.
was about to buy a 4070 Ti but it won't fit properly in my case; it interferes with some motherboard cables.
cards are so thick now, holy shit
I literally can't upgrade
Yeah, its getting really stupid and I refuse to participate
>1660TI
>No 3080 in the market on my country right now
>3090 is hella expensive
I will have to buy a 4070 TI eventually
remember when people unironically thought intel gpus would matter? kek
Intel target audience is third worlders
I still hope they won't abandon GPU market even if they can only compete at low end.
Because right now their GPU suck.
trust the plan
what plan
They'll probably eventually be fine, Novidea's drivers used to be complete dogshit up until like 2012 too
>6800 XT
>399 new shopper special at Microcenter
star!!!
>3080 was the peak of GPU and it all deteriorates from here
bros...
The 3080 has a fake MSRP and was pretty much the catalyst for the shit we see today. The real peak for GPUs was the 1080 Ti, which still performs solidly to this day, since Nvidia spent its entire die area on raw compute instead of allocating most of it to RT, AI, DLSS and other gimmicks.
So what's the best value GPU today then?
This
by far. 3080 performance at a fraction of the price.
>Value
Depends what you need.
Also depends how much $ you have to spend.
If you don't want to use Radeon, 3060ti. Otherwise whatever Radeon is offering around 350-400.
3060 Ti
congrats its an amazing little gpu
bros I can't stop myself im gonna CONSOOM
>1660, soon to be a 3050
I hope it's free, because it's not much of an upgrade.
>tfw my 970 finally died last week and I made a new build with a 3060 Ti
Goodbye old friend
>this thread
>consolebaby starved of attention
here's a you, now frick off
would honestly rather be the guy on the right
at least he has standards
the one on the left will eat the goyslop and enjoy it
At least you didn't pay scalper prices like I did for a 3070 Ti. I needed a new card for Elden Ring; I was running a 960 at the time.
>vega 56
wouldn't upgrade even if i had the money since i don't need anything more
Do you only play older games?
it still runs whatever i tried on low/medium
Fair enough. I kept my 970 until recently and it was still playing whatever I threw at it, but I was mostly sticking to indies.
is jawa legit ? anyone know ?
>equals the 3090ti for way cheaper
whats the problem?
>3090 Ti.
I'd say more like a 3090
its sad to me that so many people dont realize that ultra settings are a huge meme. High or whatever the second best preset is called is almost always enough and gets way better FPS. not to mention you can set it to high then throw in a few ultra settings and depending on the setting you'll get ultra looks for high performance.
>preset: high
>view distance: max
>vsync: on
>turn shit down from there (except view distance) until I get 60
simple as
>vsync: on
why is vsync bad?
he enjoys horrific screen tear
>turn shit down from there (except view distance) until I get 60
60 FPS should be the floor not the goal
Who fricking cares, I have a quadro m4000, don't even feel the need to upgrade lol.
>1660TI
you need more ?
do you even need more than 3060ti if youre playing on 1440p?
Depends on what you're playing and what you expect.
I would say it's enough, unless you want to play newest AAA titles for the next 2 years with high framerates and without upscaling.
>want to play newest AAA
literally who
Check all the shilling threads.
Some Anons will take the bait.
i dont think chatgpt plays ga...actually...a.i probably could play games, better then...huh...
Hmm, are you having a stroke GPTkun?
come forwards and center to my general position, kinship. i bet your silver.
for me it's a gaming laptop
I got Razer Blade 15 with 3070 and it can play any game without trouble. I don't get the hate for gaming laptop.
>15
homosexual. the 13 looked aight. lesser hardware, sure, but 13 inches is kino inches
I fell for the bigger number = better meme
it probably was better, not gonna lie. the 13 had some serious heat issues, and honestly, i lied, 14 inches is kino inches, all my systems follow that sizing.
does the section of your keyboard with the letters rty get hot? all my small bois do
Surprisingly it only made jet engine noises when playing RDR2 or Elden Ring. Also my keyboards don't get hot. Just the back.
the g14 and x13y get fire hot in those spots. luckily you're using r mostly for reloads, but, man.
and the g14 was all jet engines, all the time
great. time to play some 14 year old games while shitposting here
should i sli my gtx 1080
i dont know if its a good idea im moronic
sli was shit for gaming
what was it for? i wanted to do some renders in old 3d programs
>what was it for?
tricking you into buying a second gpu
>i wanted to do some renders in old 3d programs
you might be better off with just having the 2 cards running independently
i only buy msi
>GTX 1650
I can emulate, play vidya, shitpost and generate AI images thanks to this thing, god bless this toaster card, I'll use it until it explodes.
if you have a 1070 or anything above it/anything newer you literally don't need to upgrade and won't need to for at least 4 years
Just upgraded from a 1070 to a 3070
>bestbuy literally out of gpus
>newegg only selling used crypto gpus from chink sellers
where do you buy cards now? or is everyone just waiting?
I wanted to upgrade for better vr but there's no point
Micro Center.
prebuilts. unironically.
I can't believe nvidia fricked up enough that people unironically convince others to buy Radeon. AMD's name was mud for years and nvidia was basically the only player on the field.
nvidia is living up to their name so it's only fitting
a 6700xt would be nice about now to play some indies and some shit from the past few years I want at high refresh rate 1440p
>$799
>192-bit memory bus
What the frick were they thinking???
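for anyone wondering what a 192-bit bus actually means in practice: peak memory bandwidth is just bus width times the memory's data rate. quick sketch below, the clock figures are the commonly cited GDDR6X rates so treat them as approximate:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bits per transfer * data rate, divided by 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

# 4070 Ti: 192-bit bus, ~21 Gbps GDDR6X
print(bandwidth_gbs(192, 21))   # 504.0 GB/s
# 3080 10GB: 320-bit bus, ~19 Gbps GDDR6X
print(bandwidth_gbs(320, 19))   # 760.0 GB/s
```

so on paper the 4070 Ti has way less raw bandwidth than even a 3080. the huge L2 cache is what's supposed to make up the difference, but how well that holds up at 4K over time is the open question.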
>1660 Ti
>5800X3D
it's actually breddy guude and i hate AAA shit anyways but elden ring actually ran well
>Mid range video cards are now 1300 leaf bucks and more
Good fricking lord. I really miss affordable video cards.
>Mid range video cards are now 1300 leaf bucks and more
4070 ti = $799
Welcome to Canada
>4070
>midrange
are you stupid? no game at medium/high needs a 4070
midrange these days would be 3000 cards and lookie here (CAD btw)
>no game at medium/high needs a 4070
Some people like to play above 60fps.
"midrange" so you need to pick, if you want everything then say so
1440p is midrange. 4k is high end. 1080p is poorgay-tier.
please provide a link to this supposed 4070 ti that costs $799 CAD
dude just live in america. microcenter and best buy have tons of em for that msrp
last time i was there, i waited 2 hours outside in a car while waiting for my "number" to come up, put on my mask, and then got told that someone in line must have the monitor the website said was in stock
i still miss it. why do you have to be 6 hours away by car, frick you microcenter
i don't care about your gay story, just know that they're in stock here for msrp
motherfricker
https://www.microcenter.com/product/662231/zotac-nvidia-geforce-rtx-4070-ti-trinity-triple-fan-12gb-gddr6x-pcie-40-graphics-card
still waiting
>recently upgraded to 1660 Super OC
>970 laptop
The forspoken demo almost made it blow up ive never heard the fans spin that loud holy shit
come home white man
theres nothing white about ebay
wasn't it made by a white guy
so was like, everything. your point?
Um israelites are white, goyim!
unironically, probably
What's a good amd equivalent?
amd doesn't have any good products. sorry
>1234
hows it treating ya?
>doesn't have the 4321 teai
poorgay
>RX 480
>nintendo switch
>GT 1030
Upgraded to a 3070Ti and built a hand-me-down PC to my bro with a 1080Ti
Everything runs good at 1440p MAX settings with my i9
>Radeon HD 7950
I want to buy a new gpu. Would it be moronic to get a 4090, or should I be more sensible and bump it down a bit like a 4080 or 30xx?
Get the one that doesn't start house fires.
wait which one starts house fires
all of them
your money, we dont give a frick
if you weren't moronic you wouldn't be considering a 4090
so buy a 4090
If money is no object and you need a GPU NOW, 4090.
The 4070 Ti is generally a better value and, outside of a few 4k benchmarks, is a bit better than or equal to the best of the previous gen (3090 Ti) while being cheaper and more power efficient.
Or split the difference and go for a 4080.
IF you have an actual budget just wait.
>Intel UHD 630
Will prices come down for the 50 series bros? 🙁
>my 2014 PC has died
Frick forced updates
Recommend me a GPU in terms of price and performance. I’m not into 4K and breaking the limit or whatever it’s called.
Why are you crying? You need more?
yes
he tried to loot one?
>8K marketing
>12gb 3080
>1440p 165hz monitor
Video games are fun.
It's like 3 times more powerful than a PS5, don't sweat it.
I upgraded from a 970 to a 2060 Super and it suits all of my needs, even VR for the most part.
choose your fighter
It's crazy how this is still accurate and this was made before Arc.
Some things never change.
>4080
Sorry for being part of the problem guys, but this card is great. It's so cool and quiet.
So what's the sweetspot then
1080p - 3060ti
1440p - 3080 12gb or 4070ti depending on price
4K - 4080 or 4090 depending on how much money you want to burn
upgrading my gpu for the first time bros. (2060rtx to 3080). I'm a little nervous because I'm expecting a cable to be missing or some shit like that but very excited. Never ever have I used a highish end gpu. Running squad at 1440p ultra wide is going to be epic...
congrats anon, welcome to the party
The other thing worth keeping an eye on here is that both nVidia and AMD have wafer supply agreements with TSMC. Those were locked in several quarters ago, and both companies will have to cut prices if they’re going to avoid a massive build up of inventory. The chips are getting manufactured regardless of end user demand.
AMD is in a marginally better position in this regard simply because of the breadth of their portfolio at this point - if consumer desktop isn’t selling, they can use those wafers for Epyc, or laptop chips, or even GPUs or FPGA’s.
There is only a certain number of people who will pay $1600 for a graphics card. You will eventually reach a point (pretty quickly) where everyone who can justify buying a 4090 at the current price will already have one. At that point excess stock simply will not sell. Price has to come down.
>Price has to come down
Soon.
>There is only a certain number of people who will pay $1600 for a graphics card.
>tfw no comfy six figure wfh web dev job where i get paid to shitpost on Ganker and play runescape all day.
Still waiting on 7000x3d chips
>RX 6700
upgraded from a RX 570. Pretty happy with it.
>1060 6gb
Sometimes I think about upgrading, but I've never had a real reason to since I mainly play old games, indie games and emulate.
If I ever do upgrade it'll likely be a 20xx card. Maybe a 30xx card but I'd never need that much power tbh.
think of it like this: 20xx for 1080p, 30xx for 1440p.
>3080 12gb
so explain to an illiterate techgay like me, are a 3080 10gb and a 3080 12gb gonna have much of a difference? a couple hundred dollars for a couple of frames seems kind of excessive.
Not him, but it's a pretty big difference. The 12gb 3080 is basically a 3080ti. It's not just the VRAM, it also has more CUDA cores and a bigger bus. There's no MSRP, so if you're lucky you can find one for a good price. I got mine (MSI) for under $800, which is pretty great.
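to put rough numbers on that difference, here's a quick comparison. figures are from memory, so double-check before buying:

```python
# Rough 3080 10GB vs 12GB spec comparison (figures approximate, verify before buying).
specs = {
    "3080 10GB": {"cuda_cores": 8704, "bus_bits": 320, "vram_gb": 10},
    "3080 12GB": {"cuda_cores": 8960, "bus_bits": 384, "vram_gb": 12},
}

for name, s in specs.items():
    # Both use ~19 Gbps GDDR6X: bandwidth = bus width * data rate / 8 bits per byte
    bw = s["bus_bits"] * 19 / 8
    print(f'{name}: {s["cuda_cores"]} cores, {s["vram_gb"]}GB, {bw:.0f} GB/s')
```

the core count bump is small (~3%), but the wider bus is a ~20% bandwidth jump (760 vs 912 GB/s), which is why the 12gb behaves more like a 3080ti than a 3080 with extra vram.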
remember when gpus were good ? me neither
buy used
>2070 super
>2060 Super
1080p 144hz is good enough for me
>gpus barely 3yrs old dont even get driver updates
https://wccftech.com/amd-has-not-released-a-new-radeon-driver-for-rdna-2-radeon-rx-6000-gpus-in-2-months/
Nice one amd, no drivers. Sold my 6900xt 2mo ago, good timing on my part. Was gonna get a 7900xtx but im over amd's issues, vr and encoders have been broken since may last year and still unfixed in the new cards
AMD is always slow on driver updates, calm down.
They're just releasing these emergency drivers for the 7000 series because of stability issues. But the 6000 and 5000 series will get driver updates eventually, just be patient.
Mate, they left my 6900xt broken for over 7mo, and the 7900xtx for two months and counting. thats not slow, that's glacial
>3dfx Voodoo2
>3090 Ti
I unironically bought a 3090ti a week before the 40 series launched
I bought it brand new for approximately 1300 USD
3060ti owner here someone talk me out of selling this and buying Arc 770
Do it moron.
>2080ti
>waiting for 4060 to upgrade my 1660s
wtf is this pricing? I thought mining shit is over now
>Waiting for 4060
You might as well buy 3060Ti now and save yourself disappointment.
>tfw midrange PC with 1080p and 60/144FPS depending on the game
>perfectly fine with it, only will replace it when the parts die
i stopped caring how a game looks
just if it runs well
>Game runs like shit
>But it also stutters like hell
Best combo.
3070 ti runs literally anything with ease so I really don't know why you would bother
3060 TI and its power is so fricking unnecessary. There's like 2 games, CP77 and Hogwarts Legacy, that I really even need it for. It's in its own rig, and this PC that i'm posting from now has a waaaaaaaaay less capable graphics card but it can handle all the games I actually play just fine, so the other computer is practically a console
How's a 6700XT or 6600XT?
My old RX 480 is finally dying and can't run everything I throw at it smoothly anymore, so I'm looking to upgrade. It's basically impossible to get Nvidia at a decent price where I live (UK) so I'll be passing on that
upgraded from a 1080ti i've had for 6 years to a 4070ti
Tech gays like you might cry because "MUHHH WAIT" "MUUHHHH AMD"
but most people don't give a frick about bus width. The game runs at 4K60fps and that's all I ask of my card
Now frick off and go back to your mom basement
You sound like the same people who bought a 3080 10gb and was like "DURR WHO CARES IF IT HAS LESS VRAM, LESS CORES, AND IS A GIMPED CARD I PAID $1000 TO PLAY AT 1440P-4K"
kek all those 3080 10gb and 12gb owners got cucked with vram
>RX570 8GB