No, because you can DSR at higher resolutions, and get the mother of all AA. For old games, it's gold.
No because 1440p is a meme
nice cope poorgay
>1080p
It's not 2011 anymore
>It's not 2011 anymore
Unfortunately.
And that's why everything sucks.
2011 was equally shit zoomzoom. Bring me back to 2007 before the recession and before PC gaming became nothing but console ports.
iPhone and social media had catastrophic effects for not just video games but the society as a whole.
Recession also heavily crippled the PC gaming industry hence why very few games were being made exclusively for the PC after 2008. Console ports with high res packs don't count
You should take the 1440p pill. The first thought I had when I made the jump from my 1080p monitor to a 1440p one was "wow, I wish I would have switched sooner". Plus now is a really good time to make the upgrade. You can get a solid 1440p 144/165Hz monitor for like 200-300 bucks, whereas just a few years ago those specs were in the 500-buck ballpark.
>"wow, I wish I would have switched sooner"
that literally explains nothing. Switching to two monitors is alright because you increase your workspace, but going from 1080p to 1440p can give you more problems, since a lot of UI stuff isn't suited for that resolution yet
Nah, 1440p has fine UI scaling for the vast majority of worthwhile games. It's 4K where things start to get a little fricky.
60fps/Hz is not smooth at all. 144Hz just feels a lot nicer in everything you do, even desktop and productivity stuff. Though beyond 144Hz is where you rapidly start hitting diminishing returns.
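For what it's worth, the smoothness difference is easy to put numbers on: each refresh rate pins down how long every frame sits on screen, and the savings shrink fast past 144Hz. A quick sketch:

```python
# Per-frame display time at common refresh rates.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 120, 144, 240, 360):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# 60 -> 144 Hz shaves ~9.7 ms off every frame; 144 -> 240 Hz only ~2.8 ms,
# which is where the diminishing returns come from.
print(f"{frame_time_ms(60) - frame_time_ms(144):.1f} ms saved going 60 -> 144")
print(f"{frame_time_ms(144) - frame_time_ms(240):.1f} ms saved going 144 -> 240")
```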
Can someone explain this 165Hz meme and bullshit like that to me? What's the advantage? 60fps looks fluid enough, I just don't get it. This is one of the most moronic things I have ever seen... is it because pros need 200 extra fps to fill their autism? Dude, all the good players play on low-end PCs, and they are good because they learned the game, not because of some magical fps trickery bullshit... the 60+ fps thing is not even worth it for singleplayer games either, it's just... extra fps, I don't get it.
60 fps seems fluid only if you can ignore the blatant visual tearing in any game with fast moving objects and cameras
I'm not even a gay into competitive fps even though I wanted to get gud I just quickly realized that was gay and went onto playing singleplayer games
personally I just like it better; everything is smoother, and even scrolling in your internet browser is just way better
it's just simply better you only need 240hz+ if you're a homosexual or play pro
you'll feel it soon, poorgay, I coped too. It mostly peaks at 144Hz though; even then, you can't really deny the G-Sync and FreeSync advantages too
144Hz is legit. Smooth turning and movement just makes every game feel better to play, and in a small way look better.
240hz is a meme though and 360hz kinda is too.
I'm waiting for oled prices to come down.
never had any interface scaling issues myself
at worst I had to install a mod to make the UI bigger
Depends on the game; I use it for 1440p and generally have no issues
No, the 3060 Ti is more of a 1080p card than a 1440p one after all, and for older games you can use DLDSR.
don't worry lmao, in 3 years, when games start dropping the PS4, your 3060 will begin to suffer even at 1080p
No
Actually yeah why does memetracing perform so much better on Nvidia cards?
Isn't it their proprietary tech? Only makes sense
actual cores made for raytracing, AMD doesn't have that, just optimization
Because it's an Nvidia tech demo feature that they added with the 20 series, with dedicated silicon for it on the cards. All AMD could do to add support was make a way for their cards to handle the same feature without anything specially made to handle it. In another 2 gens they may start having dedicated RT silicon on their cards if no better method for calculating the rays gets created.
>3060ti
it can run 1080p at best, lol.
The 3060 Ti can run almost any game at 1440p + 60fps nowadays; even Metro Exodus you can run at 1440p, all maxed, full RTX, at 70~80 fps. But in 2 years it will become a 1080p card obviously.
>But in 2 years it will become a 1080p card obviously.
Yes so this is exactly why 1440p is a trap to make you consume more. You get that "I just can't move back to 1080p" feeling.
Yep. I fell for the trap. GTX 970, then got a 1080... then a 3070.
Otherwise 1080p would still be fine on the 1080.
Display is the single most important part of a PC build. Are you sure you want to skimp out on the thing you literally stare at the most?
It is also the part which is reliant on all other parts so what you're saying is just dumb okay. If it is so important why didn't you go for 4k? You know it is o b j e c t i v e l y higher quality right? What are you poor? You know you stare at it? Right? hurdur
Diminishing returns.
Are a long way off unless you are actually blind.
>1440p is a trap
ray tracing is the real trap and only useful in old games
Ray tracing isn't something you buy. It's just something that comes up with the current generation cards. Or are you saying that everyone who bought any RT capable card is an idiot?
No, I'm saying anyone using it is a moron
There's no real visual difference unless you are playing something that looked shit to begin with
No, you'll never push your card that hard, so it won't overheat or be abused into a shortened lifespan.
>Am I wasting my 3060 Ti by sticking to 1080p?
Yes, next question.
1080p 24" 240hz is perfection.
>24"
I mean if you want to game on a postage stamp sized display then sure...
24" monitor is the ideal size for FPS / competitive gaming though, you can see the entire screen without needing to move your head/eyes.
27" was just too big for me
It's a matter of getting used to it. I thought that too at first but ofc it will feel big if you're used to 24" and have used them exclusively for x amount of years.
Nah I really tried. I dropped $1k on a 1440p display in 2018. Used it for 3 months before selling it. I really tried and wanted to like it but I couldn't.
Perhaps you are right.
Also felt like there was more input lag which I didn't like either. I never got used to that in 3 months.
for me it's the upcoming 23.8" 1440p 144hz+ IPS monitors
27" is too big for fps games and 1080p 24" looks pretty bad after being on 1440p monitor/4k TV for a few years
IMO 24" 1440p is cringe. Too much DPI for the screen size. Additional input lag with no real gain in visibility or visuals, and a fricked FOV that will have you running at 80 just to see shit.
I bought an ultragear 1080p ips monitor
it is a good monitor. no dead pixels. very bright, very colorful. still under 200 bucks.
Wait, I bought this piece of shit and had to return it because it had colour banding.
haven't noticed that.
If you had it you would notice immediately; either I had shit luck or you had good luck. Or maybe it's because refunds are so easy in my country that companies sit on returned broken shit and ship it out as new, pretending the product isn't faulty.
coulda been the cable or video card. this monitor is really sweet imo.
Yes
3080 Ti here, 1080p all day, undervolted, liquid cooling, racked in a cooling unit (refrigeration) with ventilation so I get no leaking. Intend to stretch this b***h all the way to 2030. I do not give a frick about modern graphix, I don't buy AAA games.
my advice is to wait for the Meteor Lake CPUs. I believe they will be DDR5-only.
I bought an i5 12600K CPU. Avoid the more powerful Intel CPUs like the i7 12700K; they run very hot. They are poorly engineered imo.
1080p is enough
Horseless carriages will never catch on.
No it's actually a gigachad move because more frames + infinite future proof
>3070
>4k144 monitor
Works for me
I don't care about Assassin's Creed 17 at ultra ray tracing settings
Another thread full of tech illiterate monkeys. Just stick to consoles, PC is too hard to understand apparently.
>Do you want to keep the 3060ti for more than 2 years?
Stick with 1080p
>Are you going to sell it and buy a new one in 2 years
get a 1440p since it runs every modern game at 1440p+60fps but it will start struggling in 1~2 years.
i use my 1060 for 1440p so yes
No. In general 1080p and 60fps is more than plenty for just about any game. Going above those values instantly doubles, sometimes outright quadruples the system requirements.
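The "doubles, sometimes quadruples" claim roughly checks out if you treat GPU load as proportional to pixels pushed per second. This is a back-of-envelope sketch that ignores CPU cost, settings, and per-game variance:

```python
# Rough GPU load ~ pixels per frame x frames per second, relative to 1080p60.
RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def relative_load(res: str, fps: int, base=("1080p", 60)) -> float:
    w, h = RES[res]
    bw, bh = RES[base[0]]
    return (w * h * fps) / (bw * bh * base[1])

print(round(relative_load("1440p", 60), 2))   # ~1.78x a 1080p60 workload
print(round(relative_load("4K", 60), 2))      # exactly 4x the pixels
print(round(relative_load("1440p", 144), 2))  # ~4.27x, res bump plus fps bump
```

So 1440p60 alone is almost a doubling, and 4K60 or 1440p144 is the quadrupling case.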
I wanna get a 3060 Ti and a G-Sync 1440p monitor in the near future when both get cheaper as the new GPUs release
what is a good 1440p monitor to keep in mind?
HP x27q, MSI Optix G273QF, Lenovo G27q-20.
meant for you
>HP
>MSI
>Lenovo
>BRAND
>he's an asus gay or god forbid, sam***g
yeah better than any of that dogshit brand
it goes dell>samsung>asus
rest is dogshit and irrelevant
For me, it's the HP Omen X
Sam***g has god-awful QA and you will be stuck in a perpetual wait for firmware updates.
>2 fans instead of 3
Ya fricked up kid
>quoting him 3 separate times
Ty
The M32Q is one of the rare acceptable 1440p monitors, along with the Blur Busters certified ones like the Eve series (which are too expensive and hell to order)
https://www.rtings.com/monitor/reviews/gigabyte/m32q
M32Q is incredibly based for the price.
I was looking at the M32Q but ultimately went with the 27Q due to budget. Did I gimp myself?
The 27Q has higher pixel density, as the screen is smaller with the same number of pixels, so no.
t. using it right now
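The pixel density point is just geometry: same 2560x1440 panel, smaller diagonal, higher PPI. A quick check (27" and 32" matching the M27Q/M32Q sizes):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI at 27"
print(round(ppi(2560, 1440, 32), 1))  # ~91.8 PPI at 32"
```

So the 27" version is the denser (sharper-looking) of the two at the same resolution.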
What's the response times like?
I see a higher-than-usual GTG but a very low total response time. Personally I haven't ever seen a monitor like that, only very low GTG with mediocre total response.
According to rtings it's pretty good
https://www.rtings.com/monitor/reviews/gigabyte/m27q
My issue is I had an older monitor with 3ms GTG but 17ms total response time and it was blurry as frick.
There's also the M27Q X that just launched recently. Same as the M27Q but with a 240Hz refresh rate. Dunno about the price though.
Yes you are. 3060 Ti is more of a 1440p card if you play recent graphics intensive games.
No he isn't. DLDSR exists for a reason; it might not be 100% the same as native 1440p, but it's good enough, and a 3060 Ti will struggle to run 1440p games in 1~2 years. So unless he plans to be a consoooomer and change his video card every 2 years, he's fine with a 1080p display.
Possibly, depending on your settings and max refresh rate, but then there's also the sheer picture and motion quality of that monitor.
I have a 1060 6GB and it's hooked to a 4K OLED TV and a 1440p/144Hz ultrawide and I can still manage, so I'm gonna say yes.
Is 1440p actually a viable resolution going forward or is it just a stopgap between 1080p and 4K the same way 1440x900 and 1680x1050 were stopgaps between 720p and 1080p?
1440p will be a viable res for a loooong time. 4k is a meme unless you're playing using a big ass screen.
Of course it's viable as long it's 16:9
You should be worried about the motion resolution and performance instead. PC monitors lost it completely compared to TVs, that's the bigger issue.
1440p isn't a stopgap, it's just a simple alternative that lets you run games at higher fps for longer while still getting a detail increase
There's no reason to go with 4k until it's a joke for gpus to even render
>There's no reason to go with 4k
As a resolution, yes. But there are other reasons.
You go to 4K TVs for their far superior image quality in SDR and HDR and motion resolution quality. Especially the OLED ones. They absolutely can't be beat.
But yes, you'll pay the price in GPU. Although there are many tricks that work well enough, between DLSS, FSR, simply dealing with lower-resolution scaling, or using an ultrawide resolution on them to save even more GPU power.
t. run 4K with a 1060 just fine
>run 4k with a 1060 just fine
Opinion discarded.
It works absolutely fine though. you do realize you're not forced to use native resolution at gunpoint if you have a 4K display, right?
I'm gonna upgrade soon enough. But it works. Adjusting your GPU load is incredibly easy.
Yes, it's really good and has a fantastic BFI sync option. Absolutely no motion blur above 120fps VRR.
And btw my favorite games all run at 4K full speed. Not to mention emulation and older games which is most of my gaming anyway.
I know it sounds crazy, but if you have even a bit of experience with tools like Afterburner, you can instantly see your GPU utilization and plan accordingly. I have to go under 1440p for the image to start looking like shit. That happens a lot for recent games, of course, and there isn't much you can do about it, but turning settings down helps a lot.
Really I'm just waiting for AM5. Nothing to upgrade to until then; CPU performance can be a bigger bottleneck than the GPU.
>4K WITH A 1060
you're a genuine moron
>3060ti
already obsolete.
you'll be at 720p in a year just to hit 240hz.
How am I for 1080p gaming?
>4gb
you're fricked if you decide to play anything newer than 2015
Source games might be your shtick
Apex Legends is Source and that GPU will run it at just 1080/60
i dont consider Apex Legends a game
You can still play some new AAA games on low settings. Otherwise, fricking burn this shit and buy a new one, anon.
Finally got my hands on a 3060 Ti!
If you're playing at only 60 frames, then fricking yes. 144 frames is the minimum for this GPU if you're still rocking 1080p.
>am I wasting money by not spending money
AAAAAAAAAAHHHH WHY IS THIS WORLD LIKE THIS JUST PLAY THE FRICKING GAME AND EAT AND SLEEP AND SOMETHING
No, enjoy DLAA bro
t.3070ti 1080p(still playing some shit at 900p because big screen is bloat)
I'm also 3070ti 1080p and can't lose the feeling that I should have gotten 3060ti instead... what games do you play? Apex and Doom are the games I get most out of the card but I keep playing so many older titles I feel like an idiot.
VR
Some singleplayer titles (including RDR2, almost finished it; played Pathologic 2 before at 900p@37fps because it's really poorly optimized)
ASShomosexualS(playing too much TFT lol)
Nah
But you can go up to 1440p without a big performance loss, unlike 4K
Who the frick plays vidya at 4K? It's like ray tracing, a fricking joke.
Yeah I don't see esports gays use that res. They stick with 1080p but at 360hz or something
I am interested in 240 hz but the fact that some games will just get capped by the CPU is making me not wanna take the plunge..
If you use any CPU made in the last 3 years, other than Intel's 2-core office-work Pentium shit, then that will be enough for esports high-refresh gaming.
No it won't. For example, I played a few Steam demos during the last demo thingy and some of them would dip to 100. And then there are games like Vermintide, Dota 2... Even Team Fortress 2 runs like ass because the code is what it is...
>Have 60 inch 4k tv
>Compare games at different resolutions, 1080p, 1440p, 4k
>Only noticeable difference is the lower performance
Lol
Is it worth it to upgrade the entire kit and caboodle now, or just wait? We're on the cusp of a generation jump, but pricing is actually looking somewhat reasonable for most components unless you're into the DDR5 shitshow.
Well do you feel like you need to upgrade right now? That's always the question to ask yourself. If you do, then go for it. Otherwise you'll get stuck in the perpetual "just wait bro" cycle.
I got the itch since its been 6 years. I got a 1080 still trucking and can play everything I want to though. My excuse is that I recently lucked out and got a super ultra wide meme monitor for free and I want to actually get higher framerates at full rez.
>tfw zotac mini 1060 6gb
For now yes, but give it a year or two and you'll be just fine with 1080p.
>xx80 card on release
1440p/144hz
>xx80 card 2 years after release
1440p/60+ hz
>xx80 card 4 years after release
struggling with 1080p
and you can subtract 1 year for every class down, 70, 60, 50 etc.
Is 4k still a meme?
Yes considering upscaling is required for decent performance
Will we not be able to do 4k finally with the new gpus coming out in a few months?
with the new gpus probably but I'll be sticking to my 1440p
>what's supersampling
>what's downsampling
>what are reshades
Get your mileage, Black.
Why not buy some clothes or accessories so you can finally lose your virginity instead of buying graphics cards so you can have a higher resolution that you never notice anyway?
The more of these threads I see, the more I realize that Ganker really has no clue how computers work
Almost no one does, not even game devs themselves
Do you guys use these things? Do I need to buy one?
UPSs are unnecessary unless you have something mission critical that requires a clean shutdown.
i don't
RTX 270 super for $390
Would you do it Gankerirgins?
no
That's a 15 year old GPU anon.
Is the 3070 Ti a waste for 1440p 144hz?
anything faster than a 3070 uses GDDR6X memory, which runs very hot and will probably brick your GPU right after the warranty ends.
i got the 3070 because it uses the cooler-running GDDR6 RAM.
Guess I am buying a 4060 TI in the future then
Just make sure it’s 144hz
Will a 750 watt PSU be enough for a 4080?
Maybe enough to turn it on. Probably not enough to actually game on if you don't undervolt it.
A fricking 3080 is probably too much for a 750. That's why I went with a 3070
No it's not. I run a 5800X and a 3080, both undervolted; total power under full load is around 450-500W. At stock it's 600W.
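Those figures roughly add up if you tally typical component draws. A hypothetical budget using ballpark full-load numbers (not measurements from that poster's system), with the usual transient headroom PSU sizing calls for:

```python
# Hypothetical full-load draws (W) for a 5800X + 3080 build at stock clocks;
# real draw varies with load, silicon, and undervolting.
draws_w = {
    "RTX 3080 (stock)": 320,
    "Ryzen 7 5800X": 120,
    "board/RAM/SSD/fans": 60,
}

total = sum(draws_w.values())
print(f"estimated stock load: ~{total} W")
print(f"with 30% transient headroom: ~{total * 1.3:.0f} W")
```

That's why a 750W unit is workable but tight at stock, and comfortable once the GPU is undervolted.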
Worth upgrading from a faulty RX 5700 XT to this? AMD will never have a good video encoder so I want to be blessed by the NVENC gods in the near future.
If you're okay with buying that upgrade on the edge of a new release then sure. I assume you're having actual crashing driver issues on yours.
I did everything, from DDU to cleanup utilities to daisy-chaining to upgrading the PSU, case, and dual-channel RAM, and nothing seems to help. It's fricking tragic when video editing is your side hustle.
I would definitely say go ahead with the switch then. If you're making money on this, then push for a 3080 at least, unless your PSU can't handle that.
>tfw my system has more vram than system ram + swap
>play on 4K TV
>1080p (integer scaled) looks good
>anything between 1080p and 4K (lanczos + sharpening) looks good
>4K looks good
>upscaling methods actually work well enough
>people keep calling it cope
I'm glad I'm not tech illiterate
what do u mean by this?
I have a 3070 ti and 48" LG C2
There's a justified belief that non-native upscaled resolutions look like ass on LCD due to upscaling blur, but at 4K, resolutions above 1080p already look pretty good in my opinion. You can negate the upscaling blur by using lanczos upscaling (a hidden Nvidia Control Panel setting; it results in a slightly sharper image) instead of bilinear upscaling (the default one). Then you can apply some sharpening, using ReShade for example, to make the image even clearer.
FSR 1.0 does a similar thing, but natively and better.
And integer scaling makes it so there's no upscaling blur; every pixel is multiplied and has the color it should have. It's ideal for 1080p to 4K: the image looks exactly like it would on a native 1080p screen.
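The integer-scaling part is easy to illustrate: each source pixel just becomes a k x k block of identical pixels, so 1920x1080 maps exactly onto 3840x2160 with k=2 and zero interpolation. A minimal sketch on a tiny toy "image":

```python
def integer_scale(image: list[list[int]], k: int) -> list[list[int]]:
    """Nearest-neighbor upscale by an integer factor: each pixel becomes a k x k block."""
    out = []
    for row in image:
        scaled_row = [px for px in row for _ in range(k)]  # repeat each pixel horizontally
        out.extend([scaled_row[:] for _ in range(k)])      # repeat the row vertically
    return out

src = [[1, 2],
       [3, 4]]
print(integer_scale(src, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

No pixel ever gets a blended color, which is why the result looks like a native lower-res screen rather than a blurry stretch.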
>Realizing I have a 4K monitor for nothing besides a few PS5 games (running at 30 or 40 fps) and will probably run my PC, Series S, and all my other PS5 games on my 1080p monitor whenever I get back home