>arc a770 that low
Whew....
It's a card that literally cannot support anything other than DX12 natively, which is a retarded design choice that was doomed to failure. Why did low int-el do this?
They're fine now, they switched to DXVK
and yet that benchmark is from this week and it sucks in GTA V, curious!
do you think they benchmark 20 cards for every video retard
They redid the a770 benchmarks 3 months ago after massive driver improvements. The card is just bad at running GTA V.
every video, no, but there's a reason all of the cards are dated this month
no there isn't
there is
>They're fine now, they switched to DXVK
They use DXVK for a handful of whitelisted popular titles like CS:GO. Everything else still runs like garbage. Still can't emulate the Switch on Arc either thanks to this driver bug:
https://github.com/IGCIT/Intel-GPU-Community-Issue-Tracker-IGCIT/issues/159
So no TotK for Arcucks.
>why did low int-el do this?
They hired the same pajeet who shat out the worst AMD card generation in something like 2 decades.
More of a proof of concept than anything. Anyone non-retarded knows this is just to get their name out there, so when they release something good, if ever, people say "oh, intel got an amazing card" instead of "what the fuck, intel is doing gpus now? I don't know dude"
the a750 and a770 are basically the same card except the a750 is 200 dollarydoos. the 770 seems to be sold through by now anyway.
No reason to upgrade from my GTX 1070 given I stick to 1080p gaming.
Also no reason to go 4K gaming. I'd rather go VR.
>no reason to go 4K gaming
It's not the res but the display's quality
I was thinking about upgrading my 1080ti to the 4060ti and still might do it. I've been needing to downsize my PC + power consumption.
I sold my 1440p monitor as 27" is too big, so I'll be on 1080p for a long time until 24" OLEDs release.
just get a 6700xt retard
Too large.
Too much power.
No point going from a 1080ti to a 6700XT.
140w is too much power?
I was referring to this sheet.
But yes, ideally I'd like to stay under 200W total, but form factor is more my issue.
FUCK YO 200W, bro
lmao i just went from a 1080ti to a 13900k and a 4090
Also no NVENC support (obviously)
if you're unironically thinking of buying any of nvidia's products except the 4090 you're a fucking moron
every single 40 series card except the flagship has been a deliberate disaster yet you're still too fucking stupid
>drop 70% in power consumption
>be able to use a SFF case
that is my reasoning, moron.
And no, I will not get a fagman laptop.
you're already a fag for considering itx when all the other peripherals make the pc just as unwieldy to move around as a regular atx case
you're just trying to be quirky
You have no idea of my situation, moron.
I attend LAN events weekly so I move my PC often.
I also move house relatively often due to rent prices, renting rooms, people etc.
It takes up far too much room in my car and weighs at least 18kg.
how fucking poor and weak are you if the equivalent of an average parcel is a make or break situation in terms of car storage for you
It's not make or break and I don't live in my car. I can fit everything I own into my car but it would help having a smaller PC.
My car is also relatively small (picrel, also black not yellow).
The "newest" game I've played was from 2019 but sure.
forgot pic
For further context...
I'm currently between a rock and a hard place. I lost my job due to not getting the COVID Vaccine and kinda just been a NEET since then. I have more than enough to survive on but I can't outright buy a home or get a loan... Nor morally do I want to spend $1M on an average looking house in shitsville.
So I've been bouncing around places a lot.
Why do you have a Benz if you’re in dire financial straits?
I have 300k in assets (invested obviously).
Can't afford a house because the goyim want $1M for an average one.
I don't have a job currently so I can't take out a loan.
Why not move to a state where you can get a good one for half that?
Then I leave my family and friends behind.
I'm not looking. Just enjoying life.
Was fired for not getting le'vaccine.
I could live pretty comfortably just off investments alone tbh.
Hope you find a job soon Anon.
4080 has been working well for me and was $1000 AUD cheaper than the 4090.
don't do it, it's such a bad deal
>buying a $400 1080p card with non-futureproof vram
LMAOOOOOOOOOOOOOOO
What if I don't care about graphics and want a stable 90-144fps?
then you buy amd
nvidia is for ray tracing, if u dont care about graphics....
nvidia also has nvenc and dlss, idk why anyone would buy amd unless they only play at 1080p for some weird self torture reason
>DLSS UNDER 4k
>playing 4k with a 4060 ti
lol nice argument
>nvenc
for???? your 1 viewer (your mom) stream?
Wait on 3rd party benchmarks for the 16GB or hell the regular-ass version, both of which offer better trade-offs compared to the piss-poor 8GB.
I got a 6700XT a couple weeks ago. Im fine for a few years.
Based. Sapphire Pulse 6700XT here and it's great.
>1070 + 6700k
>NEET atm
I don't think I can last much longer bros...
980 Ti and a 6600k here. Finally ordered the parts to upgrade. 4070 Ti and 13700k.
I went from a 1060 6GB / 8700k to 4070 Ti and 13700k. Loving it so far.
What's the best upper mid-range price-to-performance stable CPU+RAM setup nowadays? Is DDR5 viable yet? The 5800X3D seemed like good value, but do AMD systems still crash constantly, have USB bugs and corrupt NVMe drives? Intel CPUs now use dynamic frequency too, so are they any better? I just want my PC to be stable and have good 1% lows in unoptimized games. Currently on a 10600K with a shit overclock and tuned 4x b-die RAM. Good and stable, but I see CPUs are getting better and wonder what's the next optimal setup.
i had a 1080 + 6700k and due to circumstances i sold it and bought a flow x13 + a 3080 egpu. now im mainly just using the laptop itself with its 3050 and im content. games that require better graphics are quite often not worth the time, and i can run relatively modern games like witcher 3 and prey on a 40w gpu. its a good life.
>1070
more than good enough
i'm still running a gtx 960.
i haven't run into any game i couldn't play yet other than sons of the forest which is optimized like shit anyway so it wasn't a big surprise.
Fellow 1070 chad, only game I've had legitimate trouble with was Horizon Zero Dawn, and that's because all the particle effects crashed the framerate on High. Medium 1080p runs just fine.
how is that even possible
128 bit memory bus, so it shits itself above 1080p
Tfw you got a GTX 1060 with a 192-bit bus and you're still good for years
4K
Smaller bus width
It's why they have been marketing it as a 1080p card.
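For anyone who wants to check why bus width matters, here's some napkin math. Theoretical bandwidth is just bus width times effective memory data rate; the data rates below are approximate spec-sheet figures from memory, so treat them as assumptions. Note the 4060 Ti leans on a much bigger L2 cache to partially make up for the narrow bus:

```python
# Rough theoretical bandwidth: bus width (bits) x effective data rate (Gbps) / 8.
# Data rates are approximate retail figures, quoted from memory.
cards = {
    "GTX 1060 (192-bit GDDR5 @ ~8 Gbps)": (192, 8),
    "RTX 3060 Ti (256-bit GDDR6 @ ~14 Gbps)": (256, 14),
    "RTX 4060 Ti (128-bit GDDR6 @ ~18 Gbps)": (128, 18),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits * gbps / 8:.0f} GB/s")
# -> 192, 448 and 288 GB/s respectively: the new card has less raw bandwidth
#    than its predecessor, which is exactly what bites above 1080p.
```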
>marketing a GPU as a 1080p card in 2023
LMAO
Realistically, 1080p will never be obsolete as long as there are flat panels.
Yes. 77% of people are still on 1080p.
You don't need more.
Then you don't need a new GPU either.
I do to play at 240hz
newer games get harder to run even while maintaining the same resolution, you know
just because you had a 1080p card a decade ago doesn't mean you can run modern games at 1080p on it
garden gnome engineering
Because it's a rebranded 4050 they're selling as a 4060ti for 400/500(16gb) usd lmao
>3050 - 106 die, 23.8% of full die cores, 276mm2, 128-bit bus, 8GB GD6, PCIe4 x8
>4060ti - 106 die, 23.6% of full die cores, 190mm2, 128-bit bus, 8GB GD6, PCIe4 x8
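Those die-percentage figures check out if you plug in the commonly cited core counts. The counts below are the usual spec-sheet numbers, treat them as assumptions:

```python
# Checking the "% of full die cores" math above: each card's CUDA core count
# divided by the full flagship die of its generation. Core counts are the
# commonly cited specs, not verified silicon data.
configs = {
    "RTX 3050 vs full GA102": (2560, 10752),
    "RTX 4060 Ti vs full AD102": (4352, 18432),
}
for name, (cores, full_die_cores) in configs.items():
    print(f"{name}: {cores / full_die_cores:.1%}")
# -> 23.8% and 23.6%, matching the figures quoted above.
```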
no fucking way
>128 bit
what the fuck were they thinking
I'm not up to date with gpu performances
still running a 2060 and it's about time I move to a 4070 or something similar
don't care for ray tracing, but nvidia had some promising tech for remastering DX8, or was it DX9, games?
what are AMD and Intel doing at the 4070 price range?
My 2080ti still serves me well.
RTX 3090 is the desktop
6900XT in the Loonix HTPC
Life is good bros.
Okay but... why would you need a card that powerful in an HTPC?
I also game on it, 75" 4K 120hz VRR TV
Also because I can.
Currently playing new Zelda on Yuzu 🙂
Sweet. Have fun, Anon.
So stick with the 3060, is that the message if you don't like it?
Nvidia don't give a fuck about gaymers. They are riding the AI train.
that's why they have more features and are unmatched in the high-end market. amd is for spastic retards who think a company is their friend and willingly get scammed, enjoy waiting over a year for decent drivers
No, my gtx1650 is enough to run every game, and the games that don't run are dogshit like tlou
>tfw 5600 XT
I should still be fine... r-right?
>6 GB VRAM
ngmi
dogshit poorfag gpu
same as me, i will not upgrade since i dont play games but i think you will be fine
I knew from the moment the 4080 12gb was announced that this was the easiest skip gen we'll ever get.
>before that it was the 3XXX series because of miners and scalpers.
>before that it was because of muh memetracing and SHITLSS
Either accept you're not getting a generation like the 1000 series never ever again and just bite the bullet or quit gayming, all games are trash now anyways so what would you miss out on?
you’re supposed to buy every other gen
>shill told me that I'm technologically inept for thinking that the 4060Ti will perform worse for having a smaller bus width and less vram
>he told me to "just wait for the benchmarks and I'll be shown how wrong I was"
I wonder where he is now...?
Nah i think my 1050 ti is good enough
cinematic 1080p experience at low settings, truly how the developers intended
Yes
Every time I get a desire to play GTA 5 again, I remember I'll have to download who knows how many hundreds of GB of GTAO shit I don't care about, and I just end up playing one of the previous ones instead.
>Generation leap
>Same msrp
>Worse performance
This is a first, right?
AMD does it pretty consistently. This benchmark is bullshit and doesn't show DLSS3 performance, which is the whole point of the card (to use DLSS3), but that would hurt the "nvidia bad" narrative.
>DLSS3
DLSS3 is dogshit and I'm so happy reviewers are ignoring it
>This shit again
DLSS literally makes most games look better.
DLSS 2 makes games look better. 3 is a mess
dlss3 is literally just a form of interpolation, nothing wrong with that. it's a useful tool as long as you can apply it to an actual framerate that isn't all over the place.
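To illustrate what "a form of interpolation" means at its crudest, here's a toy sketch. Real DLSS 3 frame generation is far more sophisticated (game motion vectors plus a hardware optical-flow accelerator), not a plain blend; this is only the naive midpoint version of the same idea:

```python
import numpy as np

# Crudest possible frame interpolation: a midpoint blend of two real frames.
# The generated frame is synthesized from its neighbours rather than rendered.
def naive_midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    return ((frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2).astype(np.uint8)

# Two dummy 1080p RGB frames standing in for consecutive rendered frames.
a = np.zeros((1080, 1920, 3), dtype=np.uint8)
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
mid = naive_midpoint_frame(a, b)
print(mid[0, 0])  # -> [127 127 127]
```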
Interpolation is trash.
I disagree on that. It allows me to run it without the upscaler, just with frame gen, and to me that looks better.
Looks great!
Why don't you post the video?
You're just salty that your amd trash is behind and their frame generation will come later...
Damn look at them titties
>Buy the same hardware for the software update.
3060 is the new 1060, most popular in steam survey for years to come
maybe when it gets a price drop.
It's still way too fucking expensive for midrange gaming
40 series is trash just like the 20 series. Rumour is the 50 series will be an even bigger jump than 20 to 30 was.
Rumour was the 40 series was going to be the "greatest generation leap ever" and all that amounted to was a performance reduction. The 30 series was "nvidia learned their lesson and provided an improvement" and that got you a 3050 that was as fast as a 1660ti for the price of a 1660ti. I can tell you right now with absolute certainty that the 50 series will be complete and utter dogshit.
Will it ever get better or is pc building just fucked forever?
new gpu? what for?
we deserve this for paying $2000 for 3070s back in 2020 all the way through 2022. We will most likely get exploited for another 3-5 generations before it goes back to normal.
>we
I bought a 3070 for MSRP + the aftermarket markup of like 50 dollars over the reference card. If you can't wait until prices are normal, it's a skill issue.
5000 series when?
Delayed to make you buy the current trash
Glad that I bought the rtx 4070 despite the rumors about the 16gb rtx 4060ti.
There's rumors that the 4070 could get 16gb ram. Nvidia is switching its die around to the 4080's to allow for a 256-bit bus, making a 16gb 4070 possible.
https://www.extremetech.com/gaming/nvidia-may-repurpose-faulty-rtx-4080-dies-for-a-new-rtx-4070
There's nothing in that tweet which suggests a better 4070. Using defective AD103 dies means they'll be cut down, including dies where the memory controller is partially defective. You'll just get an AD103 die cut down to 192-bit and the 4070 spec. Nvidia already have shader wiggle room for defective dies with the 4080, since even that's not a fully-enabled chip. The only part they can't skimp on for the 4080 is the 256-bit bus, so it makes perfect sense for those to be turned into 4070s and 4070 Tis.
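Same napkin math as earlier in the thread, applied to the cut-down bus. Assuming ~21 Gbps GDDR6X on both configurations (an approximation, not a verified spec), dropping from 256-bit to 192-bit costs a quarter of the bandwidth:

```python
# Cost of cutting a defective AD103's memory controller down, using the same
# bus-width x data-rate formula as above. The data rate is an assumption.
full_bus, cut_bus = 256, 192                # bits
gddr6x_rate = 21                            # Gbps, approx.
full_bw = full_bus * gddr6x_rate / 8        # ~672 GB/s
cut_bw = cut_bus * gddr6x_rate / 8          # ~504 GB/s
print(f"256-bit: {full_bw:.0f} GB/s, 192-bit: {cut_bw:.0f} GB/s "
      f"({cut_bw / full_bw:.0%} of the full bus)")
```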
Yeah I know but they could still sell more ram on the ones that don't have a defective memory controller.
Nvidia knows how stupid the 4070 looks now and they have to fix it. The thing was already not selling well at all before the 4060 Ti 16GB announcement.
Why is a 2013 game used in 2023 benchmarks?
Why is the 3060 8GB better than the 12GB?
asking for a friend.
It's the 3060 "TI" that's better.
1070 CHADS, WE STAY WINNING
Reminder to never ever buy the in between must skip gens
900 series, 2000 series, 4000 series
they're always full of problems, gnomish shit or other bullshit, they're filler releases
God, that time I bought 900 series when 1000 just released...
The 30 series was also awful. There is no pattern here just nvidia consistently sucking ass.
30 series was good and it made the 2000 series look like trash, it just came out at the worst possible time.
there is no alternative albeit
RTX 40 series is a legit tech leap.
I'd say it should have been the GTX 10 series of the RTX line.
If they were priced appropriately and had more vram (especially for the 4070 and above) this would be the generation to get.
Look at these lower end cards.
RTX 4060 Ti 16GB and 8GB are using the 106 die, RTX 4060 and 4050 are using the 107 die
GTX 1060 6GB and 3GB used the 106 die, GTX 1050 Ti and 1050 used 107
In truth, if the new lineup looked like this:
-RTX 4060 Ti 16GB $299
-RTX 4060 Ti 8GB $249
-RTX 4050 Ti 8GB $199
-RTX 4050 6GB $149
Pretty much no one would complain. Go look at GTX 10 series pricing - even when you adjust for inflation you get nowhere near the ridiculous prices that Nvidia is asking for these cards.
They really screwed up what could have been THE generation
Correction*
I meant 4060 16GB $299 and 4060 8GB $249
4060 Ti shouldn't exist at all
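To put numbers on the inflation point above, a rough sketch using approximate GTX 10 series launch MSRPs and a ballpark ~25% cumulative US inflation factor from 2016 to 2023. Both the MSRPs and the factor are approximations, not official figures:

```python
# Rough sanity check on the inflation argument: GTX 10 series launch MSRPs
# (2016) scaled to today. All numbers here are ballpark assumptions.
INFLATION_2016_TO_2023 = 1.25
pascal_msrp = {"GTX 1050": 109, "GTX 1050 Ti": 139,
               "GTX 1060 3GB": 199, "GTX 1060 6GB": 249}
for card, usd_2016 in pascal_msrp.items():
    print(f"{card}: ${usd_2016} in 2016 ~= ${usd_2016 * INFLATION_2016_TO_2023:.0f} today")
# Even the 1060 6GB only lands around $310 - nowhere near a $400-500 x60 Ti.
```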
5000 will be nvidia's first chiplet design. there's no way it's gonna go smoothly on their first try, especially when even AMD fucked up with this and they're the chiplet design masters on the CPU side of their business (and Ryzen people did work on RDNA3; they still fucked up and had to cut its perf last minute)
On top of that it's delayed too.
I'm just gonna get a 4070 and it's gonna be fine for a few years. I'm just waiting to see if there's a Super version with more ram.
>mfw GTX970>RTX2060>RTX3080
In the case of the GTX970 I got it for $330 with Witcher 3 ($60) and MGSV ($60) making it effectively a $210 card.
In the case of the 2060 I got it b-stock from EVGA just a couple of months after release for $300. Which basically meant I bought a GTX1080 with RTX capabilities for $300. That $300 went on to carry me into 2022 due to its DLSS features. The only reason I upgraded to a 3080 is because I found one for $450.
All of that being said... the 4060ti (aka RTX4050) is absolute trash and people should be embarrassed if they even consider purchasing it.
People will whine about the dumbest shit.
JUST FUCKING BUY IT ALREADY, WE KNOW YOU WANT TO
>4K is a meme
>no good new games
Guess I'll stick with the 1080ti for another 5 years.
how is 4k a meme, you can get a 4k oled for dirt cheap now and it objectively looks better. you can keep using your crappy gpu with a crappy monitor but don't justify it by calling everything else a meme
Yes, enjoy paying $4,000 AUD every 2 years just to be able to play 4K 120 fps.
Nobody is forcing you to play at 4K or 120fps, or both at the same time
Nobody is forcing you to play AAA slop.
yeah let me play diablo 1 on my 4k monitor
>Blzzarchud
Anyway break that conditioning a little. Even my 1060 used to run 4K native very frequently before I upgraded. All the current popular games right now run on toasters.
lol kys autist
ywnbaw chud
Im playing at 4k 120fps just fine with a 4080, cost me like 1.7k aud, wtf are you talking about
4K is a meme because it requires vastly more GPU resources to run games at 4K for an insignificant gain in visual fidelity over 1440p.
This means you might be sacrificing 100+ fps just to run the game at 4K instead of 1440p
Also it requires HiDPI scaling just to use your desktop normally, or things would be way too small.
you know you can have your desktop in 1080p and play games in higher resolution right?
>Bilinear scale your desktop bro
Have you ever tried setting your monitor to a lower resolution than native, or are you just a complete pleb and think text, icons, etc. look okay like this?
my 4k TV is running the desktop at 1080p right now and there is literally nothing wrong with it. 4k is way too small, unreadable
I can clearly tell the difference between 1440p and 4k, I can't clearly tell the difference between 120hz and say 144-200hz
>I can clearly tell the difference between 1440p and 4k
unless you have a huge screen right in front of your face I don't think there is much difference, no. certainly not worth the 25% performance difference that could be used for smoke effects, lighting, draw distance etc
nta but
>insignificant gain in visual fidelity over 1440p.
You're only thinking in terms of pure res, not the entire display. A 4K OLED tv will utterly destroy a cheap 1440p gayming monitor in image and especially motion quality, I guarantee you.
The real meme is thinking that you need to fill all 8 million pixels for the thing to look good when DLSS exists. Even raw 1440p on a 4K tv looks fine. You get BFI, which removes motion blur entirely, and you get perfect HDR.
insignificant gain in visual fidelity my ass
>Also it requires HiDPI scaling just to use your desktop normally
I use the normal 100% scaling on windows no problem
>4K is a meme because it requires vastly more GPU resources to run games at 4K for an insignificant gain in visual fidelity over 1440p.
This entirely depends on screen size and viewing distance
At a monitor viewing distance the difference between 1440p and 4K at typical 1440p monitor sizes (27 and 32 in) is substantial.
Now if you have a 24 in 1440p display (or smaller) you probably have something that's crisp enough. Just like people using 1080p laptop monitors have a crisp enough screen.
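Napkin math for the size and viewing distance argument. PPI is straightforward geometry; the pixels-per-degree figure and the ~60 PPD "can't resolve individual pixels" threshold are common rules of thumb, not hard facts:

```python
import math

# Pixel density (PPI) and angular resolution (pixels per degree) for common
# monitor configs at a typical 24-inch desk viewing distance.
def ppi(w_px: int, h_px: int, diag_in: float) -> float:
    return math.hypot(w_px, h_px) / diag_in

def pixels_per_degree(ppi_val: float, distance_in: float) -> float:
    # Pixels covered by one degree of visual angle at the given distance.
    return ppi_val * distance_in * math.tan(math.radians(1))

for name, (w, h, diag) in {
    "27in 1440p": (2560, 1440, 27),
    "27in 4K":    (3840, 2160, 27),
    "32in 4K":    (3840, 2160, 32),
}.items():
    d = ppi(w, h, diag)
    print(f"{name}: {d:.0f} PPI, {pixels_per_degree(d, 24):.0f} PPD at 24 in")
# -> roughly 109 PPI / 46 PPD, 163 PPI / 68 PPD, 138 PPI / 58 PPD:
#    at desk distance, 27in 1440p sits well below the ~60 PPD rule of thumb.
```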
you got it wrong dude, you're the atypical one. if you got a 30" monitor 20 inches in front of you, you're sitting too close.
you got memed dude. these big-ass monitors are too big for regular desk usage, just like most TVs are way too big these days. it's not good for your eyes even
how is it a meme, he got a bigger display while having better pixel density than most 1440p displays. assuming your desk is large enough even a 48" can be used perfectly fine
How bad is your eyesight?
4k was always a meme. Imagine turning your computer into an oven just to play a simple video game all the time.
My PC runs cool, as does my house. See.
Nobody cares, your name is fucking dumb and you are a loser.
Not enough of a loser to be poor, though. Also, you probably go by Deathsniper48239042, broccoli head.
>b-b-but you zoomer
Wrong; eat shit ''''toiletduk'''' (zomg so randum)
It's not random at all, and it has a history, unlike your 19 years on earth.
Nvidia will only stop with this level of scamming when people stop buying their cards.
>3090
>1080p
>tfw don't have to upgrade anymore
Feels gud man
>RX 6600
>1080p
Yep, it's gaming time.
Seriously, are people just buying 1k cards for 4k meme gaming? What games do they even play? Most games on pc need 10 year old toasters to run. Only badly coded AAA games need xx90 cards and even then they run like shit.
and look like shit
What games are people even buying these cards for?
stable diffusion
can't stop cooming
I use them for Blender
>1080ti
>1080p 240hz
>don't play new games
yep, im set.
i don't care about 4k, fuckoff with this shit. i will play games at 1080p for eternity at this rate
I upgraded my RTX3080 rig to an RTX 4080 because im rich and can. kek
I dont even need to, since aside from some VR H games everything im playing runs well over 120FPS on my 3080, like returnal.
why bother nothing good came/comes out any time soon
this. plus all modern games look like fucking trash. I could probably sit on my 1060 for another 5 years. which i will probably do
Honestly, and this isn't even cope, you can game with any mid-range card from the past six years.
Unrelated, but any news of 24" OLED panels?
24" is so tiny, man up bro and go 32" - 48" already
I don't like anything larger.
I've tried 27" but hated it. Forced myself for a month and still didn't like it.
No unfortunately.
Your best bet right now are the oled 1440p ultrawides (since they're a bit smaller in height compared to a 27") but I don't recommend buying those for many reasons.
What are your many reasons? because they seem perfectly fine. if you say burn-in you are retarded and still think this is 2012
My LG C1 with the deuterium evo panel, running 2 hours a day max for a year at the absolute minimum brightness, still shows slight burn-in patterns.
But aside from that, the OLED pc monitors don't have a BFI mode, which is a huge waste of the oled response times imo, so that's why I don't like them.
I have the exact same panel, albeit not in a c1, and I've run it at 80 brightness for over a year now with no burn-in. I think you are lying. I've abused mine and left static windows/hud elements open for ages with 0 issues. Unless my monitor has some magical heatsink and better dimming, IDK how you could physically get burn-in from 2 hours of daily usage at minimum brightness. you are lying.
I wish it was a lie but playing retro games with a filter was enough to make the scanline stick around.
I think they'll clear out one day with enough pixel cleaning and it's barely visible at lower than 10% grey but I'm not lying.
DX11 is nearly 15 years old. I imagine GTA V was rewritten for DX12 to add ray tracing, but they skipped adding it to PC because Rockstar hates PC
I upgraded to a 6700 xt last year and now I need to upgrade AGAIN because I want more POWER what the FUCK
yea upgrade for less powerful card very good
U really need to work on your reading comprehension lol lmao
still haven't seen more than 2 games that absolutely require a new GPU
simply not worth it, and it just means a more expensive power bill
>fell for the 4k meme
>4k is a meme
>t. never played a game in 4k
>tfw 1080p
>tfw running 4k downsampled whenever I feel like it
I will ride out this 2070S until at least the 7xxx series from nvidiot. Only wish I had gone with a 2080S or 2080ti at the time.
>tfw bought a 2080s when Ganker was saying that it was a mistake because muh 3XXX was around the corner
Heh seems like I won big time, not changing until 6XXX
How's the RX 6800 non-xt? Man, i just want to play newer shit at high settings. I have a spare 1440p monitor, but I am still stuck on a 1660ti 2019 laptop
I'm still using a 1070 and was pretty interested in getting a non-Ti 4070 for the good power-consumption-to-performance ratio, as I have no intention of ever going above 1440p, frankly probably not even above 1080p. But the prices are INSANE, and until Sunshine is set up properly, AMD is not an option for me.
>AMD is not an option for me.
why
Try reading the post again, I said exactly why not even 10 words prior.
I still don't get why you wouldn't go AMD.
If you're just a gamer and don't even really care about emulation then AMD's cards are faster and cheaper and the power efficiency is good enough.
If you need Ada-tier efficiency then the 4070 will be even better admittedly, but you could wait for the rest of the RDNA3 cards too.
Btw the only reason you play in 4k or even in 1440p is that they don't make games that actually look good. It's better to play a 1080p + antialiasing game that looks insane than to spend those resources on resolution.
I play at 1080p on a 4K tv with integer scaling.
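Integer scaling is easy to picture in code: every 1080p pixel becomes an exact 2x2 block on the 4K panel, with no blur, unlike bilinear scaling which interpolates between neighbouring pixels. A minimal numpy sketch:

```python
import numpy as np

# Integer scaling: each source pixel maps to an exact 2x2 block, so a
# 1920x1080 frame fills a 3840x2160 panel with perfectly sharp pixels.
frame_1080p = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
print(frame_4k.shape)  # -> (2160, 3840, 3)
```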
The improvement to detail with 4k and hdr is very impressive. But it will cost you
>buy RTX 2080 (new) for $860 AUD in March 2019
>get called a retard despite finding a deal that is $500 cheaper
>2070 Super was $100 more at launch
>deal of the century
>new GPUs are shit and costly
Scored a gem.
I thought strongly this year about building a dream PC before the string of shitty unoptimized releases
I still thought about it for VR, but then PSVR2 came out. If you’re a consolefag a PC is not appealing atm
PSVR2 is surprisingly decent. Issue is no backwards compatibility with the original, but thankfully everything but a handful of games have been ported, and since today's showcase is supposed to focus on psvr2 I think we'll see the rest ported over.
>akshually you don't even need more than 1080p
>akshually you don't even need 1080p, 360p upscaled is fine
>akshually you don't even need real frames
This thread has put nvidia shills lower on the subhuman ranking than AI pajeets and cryptotards. Yes I'm aware a pie chart of these three would be more like a circle.
You need 4k, and I have THE Nvidia card.
all of these are true, unironically
>people actually fell for the 1440p meme
I'd fuck him.
It's a reasonable step up from 1080p, but not at retarded 4k, which has no actually beneficial technologies afforded to its panels yet (no 1000nit hdr which is THE bare minimum for quality hdr, shit freesync compat, most are """""smart""""" panels with forced bloat to kill the panels remotely after a few years, etc)
4k is a complete joke, 1440p is bad too, much harder to find good panels than in 1080p, but 1440p at least has SOME good panels.
The real joke is anyone going for 16:10 or (lmao) 21:10.
I have 1000nit HDR on my 4k monitor, and freesync ultimate. Lie some more.
what monitor do you have
please do share.
Odyssey Ark 4k.
>that tearing
>that retarded size
>not even 700 nits unless you're in screen tear- I mean gaming mode
yeah no. Also, it has the same bullshit I mentioned in the last part of my post, though I guess you omitted that purposefully. Enjoy when the panel takes 3 minutes to turn on because it has to load all of its bloatware that can't be disabled. You proved me wrong though, there is technically a 4k panel with 1000nits, even if it's falsified.
I don't see any tearing with VRR.
>non integer scaling for all media
Don't you get bored of always playing catch up?
15 years ago it was "he fell for the pc gaming meme", "he fell for the 60fps meme" then "he fell for the 120hz meme", "he fell for the ssd meme"
Like, yeah, we get it, you're a clown and you won't adopt something until it gets mass appeal, like your cattle brain has been conditioned to. But for a lot of people here video games are a really fucking cheap hobby, so spending $500 on upgrading hardware every few months is like piss in the wind.
if you're an old fuck, aka mid 20s to late 20s and beyond, and you can't afford a RTX 4090, you basically failed in life
>if you're an old fuck, aka mid 20s to late 20s
...the absolute state of Ganker
Most people are bald, obese or have a much higher BMI than they should, have developed several health issues and have shit knees by their late 20s.
Agreed. There's 20 year olds that spend more on snowboarding in one season than a 4090. Money you enjoyed spending isn't wasted if you aren't a dummy in debt
and if you do it every gpu cycle you just sell your old one which usually covers at least 50% or more of the new one
being poor is a skill issue
Wasting all your money on useless shit is how you become poor
When a 4090 takes all your money, you're already poor.
>all your money
U need a better job foo
Please don't tell me you're one of those idiots from the 1980s who thinks poor people are poor because they have too many color TVs lmao
I do actually yeah because I've seen them do it, it's just changed from color TVs to 120Hz VRR ones.
You do not understand poors at all if you think they even know what vrr is
But I'm poor according to all those seething replies and I do?
You're poor because of a lack of effort not iq.
bro if a trivial expense like $1-2k every 2 years brings you to the verge of poverty, you're already down bad
>$3000 is a lot
either kid or poorfag
U gonna get blasted for this dipshit post but don't listen to them: you're so right, dude
>if you're an old fuck, aka mid 20s to late 20s and beyond
holy shit Ganker is dead.
>everyone lives in the first world
>tfw still running a rx580
was hoping the 4060 was going to drive down 3060 prices...
i love my 1080. 7 years and still going strong. don't know why they don't ever include it on benchmarks, tons of people still use it and the longevity/price/performance makes it still relevant for comparison today.
Really wish I bought it back then instead of the 1060.
Still running with Sapphire Vega64
Nice. AMD should've stuck to HBM
They still use it for most of their cards except rdna gayman models iirc
No one would believe today that card was $300 in 2019
It's up.
>an actual budget card
looks good tbh. I might finally switch to AMD
No, I want a new display, and I'm having trouble justifying buying it. 1000 USD for a damn monitor.
>R9 280X
How does Nvidia keep getting away with it? Is AMD just not competing? Or it is an entrenchment problem?
>well these cards at my budget suck but what else am I gonna buy? I don't wanna switch vendors now...
>Is AMD just not competing?
Whether it's true or not, AMD still has the reputation of failing on the software end of things while Nvidia does not, and software really matters a lot when you're talking about hardware-accelerated graphics.
>How does Nvidia keep getting away with it?
they make most of their money on the enterprise side, not retail GPUs. so they can afford to keep tossing low-effort shit at gamers. they're also going all-in on AI, and rightfully so.
>Is AMD just not competing?
they capture market share here and there, for example by offering 16GB VRAM midrange cards like 6800/6900 but AMD will always be playing catch up on innovation like DLSS, AI, etc. AMD GPUs basically exist to give you "good enough" until you can afford high end XX90 nvidia cards which will last you 3-4 years. 4090 will be good until 2028.
No one will serve jail time for the 4060 Ti's release.
don't like it don't buy it
>bwaaaaaaah bwaaaah don't get a 4080 it's bad value for money
same price as the 7900 xtx and same performance without having to deal with amd driver bullshit
fuck the shills