Am I wasting my 3060 Ti by sticking to 1080p?

  1. 2 years ago
    Anonymous

    No, because you can DSR at higher resolutions, and get the mother of all AA. For old games, it's gold.
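    The DSR trick described here is supersampling: render above native resolution, then filter the extra pixels back down, which smooths jagged edges. A toy sketch of the idea using a plain box filter on grayscale values (Nvidia's actual filter is reportedly Gaussian with a smoothness slider, so this is only the principle):

```python
def downsample_2x(image):
    """Average each 2x2 block into one output pixel (box filter).
    This is the core of DSR-style supersampling AA: render high,
    filter down, and hard edges come out smoothed."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[y]), 2):
            block_sum = (image[y][x] + image[y][x + 1]
                         + image[y + 1][x] + image[y + 1][x + 1])
            row.append(block_sum / 4)
        out.append(row)
    return out

# A jagged black/white edge rendered at 2x native resolution...
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
print(downsample_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```

    The 127.5 is the point: the stair-step edge averages into an intermediate gray, which is exactly what anti-aliasing does.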

  2. 2 years ago
    Anonymous

    No because 1440p is a meme

    • 2 years ago
      Anonymous

      nice cope poorgay

  3. 2 years ago
    Anonymous

    >1080p
    It's not 2011 anymore

    • 2 years ago
      Anonymous

      >It's not 2011 anymore
      Unfortunately.

    • 2 years ago
      Anonymous

      And that's why everything sucks.

      • 2 years ago
        Anonymous

        2011 was equally shit zoomzoom. Bring me back to 2007 before the recession and before PC gaming became nothing but console ports.

        • 2 years ago
          Anonymous

          iPhone and social media had catastrophic effects for not just video games but the society as a whole.

          • 2 years ago
            Anonymous

            Recession also heavily crippled the PC gaming industry hence why very few games were being made exclusively for the PC after 2008. Console ports with high res packs don't count

  4. 2 years ago
    Anonymous

    You should take the 1440p pill. The first thought when I made the jump from my 1080p monitor to a 1440p was "wow, I wish I would have switched sooner". Plus now is a really good time to make the upgrade. You can get a solid 1440p 144/165hz monitor for like 200-300 bucks, whereas just a few years ago those specs were in the 500 bucks ballpark.

    • 2 years ago
      Anonymous

      >"wow, I wish I would have switched sooner"
      literally explains nothing. Switching to two monitors is alright because you increase your workspace, but going from 1080p to 1440p can give you more problems since a lot of UI stuff isn't suited for that resolution yet

      • 2 years ago
        Anonymous

        Nah, 1440p has fine UI scaling for the vast majority of worthwhile games. It's 4K where things start to get a little fricky.

        >Can someone explain me this meme of 166hz and bullshit like that? Whats the advantage? 60fps looks fluid enough i just dont get it this is one of the most moronic things i have ever seen...... is it because pros need 200 extra fps to fill their autism? Dude all the good players play on low end pc's and they are good because they learned the game not because some magical fps trickery bullshit..... the 60+ fps thing is not even worth it for singleplayer games too is just ...... extra fps i dont get it.

        60fps/hz is not smooth at all. 144hz just feels a lot nicer in everything you do, even desktop and productive stuff. Though beyond 144hz is where you rapidly start hitting diminishing returns.
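        The disagreement here comes down to frame time. The arithmetic is trivial, and it also shows where the diminishing returns mentioned above kick in:

```python
def frame_time_ms(hz: float) -> float:
    # Frame time = how long each refresh stays on screen, in ms.
    return 1000.0 / hz

for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
```

        Going 60 to 144 Hz shaves roughly 9.7 ms off every frame, while 144 to 240 Hz only shaves about 2.8 ms, which is why the gains feel smaller each step up.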

    • 2 years ago
      Anonymous

      Can someone explain me this meme of 166hz and bullshit like that? Whats the advantage? 60fps looks fluid enough i just dont get it this is one of the most moronic things i have ever seen...... is it because pros need 200 extra fps to fill their autism? Dude all the good players play on low end pc's and they are good because they learned the game not because some magical fps trickery bullshit..... the 60+ fps thing is not even worth it for singleplayer games too is just ...... extra fps i dont get it.

      • 2 years ago
        Anonymous

        60 fps seems fluid only if you can ignore the blatant visual tearing in any game with fast moving objects and cameras

        I'm not even a gay into competitive fps even though I wanted to get gud I just quickly realized that was gay and went onto playing singleplayer games
        personally I just like it better everything is smoother and even scrolling with your internet
        browser is just way better

        it's just simply better you only need 240hz+ if you're a homosexual or play pro

      • 2 years ago
        Anonymous

        you'll feel it soon poorgay, I coped too. It mostly peaks at 144hz though, even then, you can't really deny GSync and Freesync advantages too

      • 2 years ago
        Anonymous

        144hz is legit. Smooth turning and movement just makes every game feel better to play, and in a small way look better.
        240hz is a meme though and 360hz kinda is too.

    • 2 years ago
      Anonymous

      I'm waiting for oled prices to come down.

  5. 2 years ago
    Anonymous

    [...]

    never had any interface scaling issues myself
    at worst I had to install a mod to make the UI bigger

  6. 2 years ago
    Anonymous

    Depends on game, i use it for 1440p and generally no issues

  7. 2 years ago
    Anonymous

    No, 3060ti is more of a 1080p card than a 1440p afterall and for older games you can use DLDSR.

  8. 2 years ago
    Anonymous

    don't worry lmao, 3 years later when games start dropping ps4, your 3060 will begin to suffer even at 1080p

  9. 2 years ago
    Anonymous

    No

    • 2 years ago
      Anonymous

      Actually yeah why does memetracing perform so much better on Nvidia cards?

      • 2 years ago
        Anonymous

        Isn't it their proprietary tech? Only makes sense

      • 2 years ago
        Anonymous

        actual cores made for raytracing, AMD doesn't have that, just optimization

      • 2 years ago
        Anonymous

        Because it's an Nvidia tech demo feature that they added with the 20 series and included dedicated silicon for on the cards. All AMD could do to add support was make a way for their cards to handle the same feature without anything specially made to handle it. In another 2 gens they may start having dedicated RT silicon on their cards if no better method for calculating the rays gets created.

  10. 2 years ago
    Anonymous

    >3060ti
    it can run 1080p at best, lol.

    • 2 years ago
      Anonymous

      3060ti can run almost any game at 1440p + 60fps nowadays, even metro exodus you can run at 1440p + all maxed + full RTX at 70~80 fps. But in 2 years it will become a 1080p card obviously.

      • 2 years ago
        Anonymous

        >But in 2 years it will become a 1080p card obviously.
        Yes so this is exactly why 1440p is a trap to make you consume more. You get that "I just can't move back to 1080p" feeling.

        • 2 years ago
          Anonymous

          Yep. I fell for the trap. GTX 970, then got a 1080... Then a 3070.
          Otherwise 1080p would still be fine on the 1080.

        • 2 years ago
          Anonymous

          Display is the single most important part of a PC build. Are you sure you want to skimp out on the thing you literally stare at the most?

          • 2 years ago
            Anonymous

            It is also the part which is reliant on all other parts so what you're saying is just dumb okay. If it is so important why didn't you go for 4k? You know it is o b j e c t i v e l y higher quality right? What are you poor? You know you stare at it? Right? hurdur

            • 2 years ago
              Anonymous

              Diminishing returns.

              • 2 years ago
                Anonymous

                Are a long way off unless you are actually blind.

        • 2 years ago
          Anonymous

          >1440p is a trap
          ray tracing is the real trap and only useful in old games

          • 2 years ago
            Anonymous

            Ray tracing isn't something you buy. It's just something that comes up with the current generation cards. Or are you saying that everyone who bought any RT capable card is an idiot?

            • 2 years ago
              Anonymous

              No, I'm saying anyone using it is a moron
              There's no real visual difference unless you are playing something that looked shit to begin with

  11. 2 years ago
    Anonymous

    No, you'll never push your card that hard, so it won't overheat or have its lifespan shortened from abuse.

  12. 2 years ago
    Anonymous

    >Am I wasting my 3060 Ti by sticking to 1080p?
    Yes, next question.

  13. 2 years ago
    Anonymous

    1080p 24" 240hz is perfection.

    • 2 years ago
      Anonymous

      >24"
      I mean if you want to game on a postage stamp sized display then sure...

      • 2 years ago
        Anonymous

        24" monitor is the ideal size for FPS / competitive gaming though, you can see the entire screen without needing to move your head/eyes.

      • 2 years ago
        Anonymous

        27" was just too big for me

        • 2 years ago
          Anonymous

          It's a matter of getting used to it. I thought that too at first but ofc it will feel big if you're used to 24" and have used them exclusively for x amount of years.

          • 2 years ago
            Anonymous

            Nah I really tried. I dropped $1k on a 1440p display in 2018. Used it for 3 months before selling it. I really tried and wanted to like it but I couldn't.
            Perhaps you are right.
            Also felt like there was more input lag which I didn't like either. I never got used to that in 3 months.

    • 2 years ago
      Anonymous

      for me it's the upcoming 23.8" 1440p 144hz+ IPS monitors
      27" is too big for fps games and 1080p 24" looks pretty bad after being on 1440p monitor/4k TV for a few years

      • 2 years ago
        Anonymous

        IMO 24" 1440p is cringe. Too much DPI for the screen size. Additional input lag with no real gain in visibility or visuals and a fked FOV that will have you running at 80 just to see shit.

  14. 2 years ago
    Anonymous

    I bought an ultragear 1080p ips monitor
    it is a good monitor. no dead pixels. very bright, very colorful. still under 200 bucks.

    • 2 years ago
      Anonymous

      Wait, I bought this piece of shit and had to return it because it had colour banding.

      • 2 years ago
        Anonymous

        haven't noticed that.

        • 2 years ago
          Anonymous

          If you had it you would notice immediately, either I had shit luck or you had good luck. Or maybe because refunds are so ez in my country that companies sit on returned broken shit and ship it out as new pretending the product isn't faulty.

          • 2 years ago
            Anonymous

            coulda been the cable or video card. this monitor is really sweet imo.

  15. 2 years ago
    Anonymous

    Yes

  16. 2 years ago
    Anonymous

    3080ti here, 1080p all day, undervolted, liquid cooling, racked in a cooling unit(refrigeration) with ventilation so I get no leaking. Intend to stretch this b***h all the way to 2030. I do not give a frick about modern graphix, I don't buy AAA games.

  17. 2 years ago
    Anonymous

    my advice is to wait for meteor lake cpus. I believe they will be ddr5 only.
    I bought an i5 12600k cpu. avoid more powerful intel cpus like the i7 12700k. they run very hot. they are poorly engineered imo.

  18. 2 years ago
    Anonymous

    1080p is enough

    • 2 years ago
      Anonymous

      Horseless carriages will never catch on.

  19. 2 years ago
    Anonymous

    No it's actually a gigachad move because more frames + infinite future proof

  20. 2 years ago
    Anonymous

    >3070
    >4k144 monitor
    Works for me
    I dont care about Assassins creed 17 at ultra raytracing settings

  21. 2 years ago
    Anonymous

    Another thread full of tech illiterate monkeys. Just stick to consoles, PC is too hard to understand apparently.

  22. 2 years ago
    Anonymous

    >Do you want to keep the 3060ti for more than 2 years?

    Stick with 1080p

    >Are you going to sell it and buy a new one in 2 years

    get a 1440p, since it runs every modern game at 1440p+60fps, but it will start struggling in 1~2 years.

  23. 2 years ago
    Anonymous

    i use my 1060 for 1440p so yes

  24. 2 years ago
    Anonymous

    No. In general 1080p and 60fps is more than plenty for just about any game. Going above those values instantly doubles, sometimes outright quadruples the system requirements.

  25. 2 years ago
    Anonymous

    I wanna get a 3060 ti and a g-sync 1440p monitor in the possibly near future when both get cheaper as the new gpu's release
    what is a good 1440p monitor to keep in mind?

    • 2 years ago
      Anonymous

      HP x27q, MSI Optix G273QF, Lenovo G27q-20.

      • 2 years ago
        Anonymous

        Ty

        meant for you

      • 2 years ago
        Anonymous

        >HP

        >MSI

        >Lenovo

        • 2 years ago
          Anonymous

          >BRAND

        • 2 years ago
          Anonymous

          >he's an asus gay or god forbid, sam***g

          • 2 years ago
            Anonymous

            yeah better than any of that dogshit brand
            it goes dell>samsung>asus
            rest is dogshit and irrelevant

            • 2 years ago
              Anonymous

              For me, it's the HP Omen X

            • 2 years ago
              Anonymous

              Sam***g has god awful QA and you will be stuck perpetually waiting for firmware updates.

        • 2 years ago
          Anonymous

          >2 fans instead of 3
          Ya fricked up kid

          >quoting him 3 separate times

    • 2 years ago
      Anonymous

      Ty

    • 2 years ago
      Anonymous

      The M32Q is one of the rare acceptable 1440p monitors, along with the blurbuster-certified ones like the Eve series (which are too expensive and hell to order)
      https://www.rtings.com/monitor/reviews/gigabyte/m32q

      • 2 years ago
        Anonymous

        M32Q is incredibly based for the price.

      • 2 years ago
        Anonymous

        >M32Q is incredibly based for the price.

        I was looking at the M32Q but ultimately went with the 27Q due to budget. Did I gimp myself?

        • 2 years ago
          Anonymous

          27Q has higher pixel density as the screen is smaller but with the same amount of pixels so no.

          t. using it right now
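          The density claim is easy to verify with the standard PPI formula (diagonal pixel count divided by diagonal inches); the sizes below are the two being compared here:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch = diagonal resolution / diagonal screen size.
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.1f} PPI')  # ~108.8
print(f'32" 1440p: {ppi(2560, 1440, 32):.1f} PPI')  # ~91.8
```

          Same pixel count over a smaller panel means higher density, so the 27" is sharper, not worse.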

          • 2 years ago
            Anonymous

            What's the response times like?
            I see a higher than usual GTG but a very low total response. Personally I haven't ever seen a monitor like that, only very low GTG and mediocre total response.

            • 2 years ago
              Anonymous

              According to rtings it's pretty good
              https://www.rtings.com/monitor/reviews/gigabyte/m27q

              • 2 years ago
                Anonymous

                My issue is I had an older monitor with 3ms GTG but 17ms total response time and it was blurry as frick.

            • 2 years ago
              Anonymous

              There's also the M27Q X that just launched recently. Same as the M27Q but with a 240hz refresh rate. Dunno about the price though.

  26. 2 years ago
    Anonymous

    Yes you are. 3060 Ti is more of a 1440p card if you play recent graphics intensive games.

    • 2 years ago
      Anonymous

      No he isn't, DLDSR exists for a reason. It might not be 100% the same as native 1440p but it's good enough, and a 3060ti will struggle to run 1440p games in 1~2 years, so unless he plans to be a consoooomer and change his video card every 2 years, he's fine with a 1080p display.

  27. 2 years ago
    Anonymous

    Possibly, depending on your settings and max refresh rate, but also consider the sheer picture and motion quality of that monitor.

    I have a 1060 6gb and it's hooked to a 4K Oled TV and 1440/144hz ultrawide and I can still manage so I'm gonna say yes.

  28. 2 years ago
    Anonymous

    Is 1440p actually a viable resolution going forward or is it just a stopgap between 1080p and 4K the same way 1440x900 and 1680x1050 were stopgaps between 720p and 1080p?

    • 2 years ago
      Anonymous

      1440p will be a viable res for a loooong time. 4k is a meme unless you're playing using a big ass screen.

    • 2 years ago
      Anonymous

      Of course it's viable as long it's 16:9
      You should be worried about the motion resolution and performance instead. PC monitors lost it completely compared to TVs, that's the bigger issue.

    • 2 years ago
      Anonymous

      1440p isn't a stop gap it's just a simple alternative that lets you run games at higher fps for longer while still getting a detail increase
      There's no reason to go with 4k until it's a joke for gpus to even render

      • 2 years ago
        Anonymous

        >There's no reason to go with 4k
        As a resolution, yes. But there are other reasons.
        You go to 4K TVs for their far superior image quality in SDR and HDR and motion resolution quality. Especially the OLED ones. They absolutely can't be beat.
        But yes, you'll pay the price in GPU. Although there are many tricks that work well enough: DLSS, FSR, simply dealing with lower resolution scaling, or using an ultrawide resolution on them to save even more GPU power.
        t. run 4K with a 1060 just fine
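        The tricks listed above all trade pixel count for GPU time. Comparing raw pixel counts shows why dropping below native 4K helps so much (the ultrawide mode below is one illustrative example, not a resolution named in the thread):

```python
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3840x1600 ultrawide": (3840, 1600),  # example letterboxed mode on a 4K panel
    "4K native": (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>19}: {px:>9,} px ({px / base:.2f}x 1080p)")
```

        Native 4K pushes exactly 4x the pixels of 1080p, which is roughly the GPU cost multiplier before any upscaling tricks.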

        • 2 years ago
          Anonymous

          >run 4k with a 1060 just fine

          Opinion discarded.

          • 2 years ago
            Anonymous

            >4K WITH A 1060
            you're a genuine moron

            It works absolutely fine though. you do realize you're not forced to use native resolution at gunpoint if you have a 4K display, right?
            I'm gonna upgrade soon enough. But it works. Adjusting your GPU load is incredibly easy.

            >M32Q is incredibly based for the price.

            Yes, it's really good and has a fantastic BFI sync option. Absolutely no motion blur above 120fps VRR.

            • 2 years ago
              Anonymous

              And btw my favorite games all run at 4K full speed. Not to mention emulation and older games which is most of my gaming anyway.
              I know it sounds crazy but if you have even a bit of experience with tools like Afterburner, you can instantly see your GPU utilization and plan accordingly. I have to go under 1440p for the image to start looking like shit. It happens a lot for recent games, of course and there isn't much you can do about it, turning settings down help a lot.
              Really I'm just waiting for AM5. Nothing to upgrade to until then, CPU performance can be a bigger bottleneck than GPU bottlenecks.

        • 2 years ago
          Anonymous

          >4K WITH A 1060
          you're a genuine moron

  29. 2 years ago
    Anonymous

    >3060ti
    already obsolete.
    you'll be at 720p in a year just to hit 240hz.

  30. 2 years ago
    Anonymous

    How am I for 1080p gaming?

    • 2 years ago
      Anonymous

      >4gb
      you're fricked if you decide to play anything newer than 2015

    • 2 years ago
      Anonymous

      Source games might be your shtick

      • 2 years ago
        Anonymous

        apex legends is source and that gpu will only run it at 1080/60

        • 2 years ago
          Anonymous

          i dont consider Apex Legends a game

    • 2 years ago
      Anonymous

      You can still play some new AAA games on low settings. Otherwise, fricking burn this shit and buy a new one, anon.

  31. 2 years ago
    Anonymous

    Finally got my hands on a 3060 Ti!

  32. 2 years ago
    Anonymous

    If you're playing on 60 frames only, then fricking yes. 144 frames is a minimum for this GPU if you're still rocking with 1080p.

  33. 2 years ago
    Anonymous

    >am I wasting money by not spending money
    AAAAAAAAAAHHHH WHY IS THIS WORLD LIKE THIS JUST PLAY THE FRICKING GAME AND EAT AND SLEEP AND SOMETHING

  34. 2 years ago
    Anonymous

    No, enjoy DLAA bro
    t.3070ti 1080p(still playing some shit at 900p because big screen is bloat)

    • 2 years ago
      Anonymous

      I'm also 3070ti 1080p and can't shake the feeling that I should have gotten a 3060ti instead... what games do you play? Apex and Doom are the games I get most out of the card but I keep playing so many older titles I feel like an idiot.

      • 2 years ago
        Anonymous

        VR
        Some singleplayer titles(including RDR2, almost finished it, played pathologic 2 before at 900p@37 fps because it's really shitty optimized)
        ASShomosexualS(playing too much TFT lol)

  35. 2 years ago
    Anonymous

    Nah

    But you can go up to 1440p without a big performance loss, unlike 4K

    • 2 years ago
      Anonymous

      Who the frick plays vidya in 4K? It's like ray tracing, a fricking joke.

      • 2 years ago
        Anonymous

        Yeah I don't see esports gays use that res. They stick with 1080p but at 360hz or something

        • 2 years ago
          Anonymous

          I am interested in 240 hz but the fact that some games will just get capped by the CPU is making me not wanna take the plunge..

          • 2 years ago
            Anonymous

            If you use any cpu made in the last 3 years, other than intel office work pentium 2 core shit, then that will be enough for esports high refresh gaming.

            • 2 years ago
              Anonymous

              No it won't. For example I played a few Steam demos during the last demo thingy and some of them would dip to 100. And then there's games like Vermintide, Dota 2... Even Team Fortress 2 runs like ass because the code is what it is...

  36. 2 years ago
    Anonymous

    >Have 60 inch 4k tv
    >Compare games at different resolutions, 1080p, 1440p, 4k
    >Only noticeable difference is the lower performance
    Lol

  37. 2 years ago
    Anonymous

    Is it worth it to upgrade the entire kit and caboodle now or just wait? We're on the cusp of a generation jump but pricing is actually looking somewhat reasonable for most components unless you're into the DDR5 shitshow.

    • 2 years ago
      Anonymous

      Well do you feel like you need to upgrade right now? That's always the question to ask yourself. If you do, then go for it. Otherwise you'll get stuck in the perpetual "just wait bro" cycle.

      • 2 years ago
        Anonymous

        I got the itch since its been 6 years. I got a 1080 still trucking and can play everything I want to though. My excuse is that I recently lucked out and got a super ultra wide meme monitor for free and I want to actually get higher framerates at full rez.

  38. 2 years ago
    Anonymous

    >tfw zotac mini 1060 6gb

  39. 2 years ago
    Anonymous

    For now yes, but give it a year or two and you'll be just fine with 1080p.
    >xx80 card on release
    1440p/144hz
    >xx80 card 2 years after release
    1440p/60+ hz
    >xx80 card 4 years after release
    struggling with 1080p

    and you can subtract 1 year for every class down, 70, 60, 50 etc.
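    The rule of thumb above can be written out as a throwaway function. The tier names and the one-year-per-class offset are just this poster's claims, nothing official:

```python
def expected_tier(card_class: int, years_since_release: int) -> str:
    """Rough resolution tier per the rule of thumb above.

    card_class: 80 for an xx80, 70 for an xx70, and so on.
    Every class below xx80 ages ~1 year faster (poster's claim).
    """
    effective_age = years_since_release + (80 - card_class) // 10
    if effective_age < 2:
        return "1440p/144hz"
    if effective_age < 4:
        return "1440p/60+ hz"
    return "struggling with 1080p"

print(expected_tier(80, 0))  # xx80 on release
print(expected_tier(60, 2))  # xx60 two years in, per this logic
```

    By this (pessimistic) logic a two-year-old xx60-class card is already in the last tier, which matches the thread's more doomy takes.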

  40. 2 years ago
    Anonymous

    Is 4k still a meme?

    • 2 years ago
      Anonymous

      Yes considering upscaling is required for decent performance

      • 2 years ago
        Anonymous

        Will we not be able to do 4k finally with the new gpus coming out in a few months?

        • 2 years ago
          Anonymous

          with the new gpus probably but I'll be sticking to my 1440p

  41. 2 years ago
    Anonymous

    >what's supersampling
    >what's downsampling
    >what are reshades
    Get your mileage, Black.

  42. 2 years ago
    Anonymous

    Why not buy some clothes or accessories so you can finally lose your virginity instead of buying graphics cards so you can have a higher resolution that you never notice anyway?

  43. 2 years ago
    Anonymous

    The more of these threads I see, the more I realize that Ganker really has no clue how computers work

    • 2 years ago
      Anonymous

      Almost no one does, not even game devs themselves

  44. 2 years ago
    Anonymous

    Do you guys use these things? Do I need to buy one?

    • 2 years ago
      Anonymous

      UPSs are unnecessary unless you have something mission critical that requires a clean shutdown.

    • 2 years ago
      Anonymous

      i don't

  45. 2 years ago
    Anonymous

    RTX 270 super for $390
    Would you do it Gankerirgins?

    • 2 years ago
      Anonymous

      no

    • 2 years ago
      Anonymous

      That's a 15 year old GPU anon.

  46. 2 years ago
    Anonymous

    Is the 3070 Ti a waste for 1440p 144hz?

    • 2 years ago
      Anonymous

      anything faster than a 3070 uses gddr6x memory, which runs very hot and will probably brick your gpu right after the warranty ends.
      i got the 3070 because it uses the cooler running gddr6 ram.

      • 2 years ago
        Anonymous

        Guess I am buying a 4060 TI in the future then

  47. 2 years ago
    Anonymous

    Just make sure it’s 144hz

  48. 2 years ago
    Anonymous

    Will a 750 watt PSU be enough for a 4080?

    • 2 years ago
      Anonymous

      Maybe enough to turn it on. Probably not enough to actually game on if you don't undervolt it.

    • 2 years ago
      Anonymous

      A fricking 3080 is probably too much for a 750. That's why I went with a 3070

      • 2 years ago
        Anonymous

        no it's not. I run a 5800x and 3080, both undervolted; total power under full load is around 450-500w. At stock it's 600w.
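        The wattages quoted in this exchange are easy to sanity-check. The 20% headroom margin below is a common rule of thumb, not something from the thread:

```python
def psu_headroom_ok(psu_w: int, load_w: int, margin: float = 0.2) -> bool:
    """True if the measured system load leaves `margin` (default 20%,
    a common rule of thumb) of the PSU's capacity unused."""
    return load_w < psu_w * (1.0 - margin)

# Wattages quoted above for a 5800X + 3080 build on a 750 W PSU:
print(psu_headroom_ok(750, 500))  # undervolted, ~450-500 W -> True
print(psu_headroom_ok(750, 600))  # stock, ~600 W sits right at 80% -> False
```

        So both posters can be right: a 750 W unit covers the undervolted build comfortably but has zero margin at stock draw.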

  49. 2 years ago
    Anonymous

    Worth upgrading from a faulty RX 5700 XT to this? AMD will never have a good video encoder so I want to be blessed by the NVENC gods in the near future.

    • 2 years ago
      Anonymous

      If you're okay with buying that upgrade on the edge of a new release then sure. I assume you're having actual crashing driver issues on yours.

      • 2 years ago
        Anonymous

        I did everything, from DDU, to cleanup utility, to daisychaining, to upgrading psu, case and dual channel RAM and nothing seems to help. It’s fricking tragic if video editing is your side hustle.

        • 2 years ago
          Anonymous

          I would definitely say go ahead with the switch over then. If you're making money on this then push for a 3080 at least, unless your psu can't handle that.

  50. 2 years ago
    Anonymous

    >tfw my system has more vram than system ram + swap

  51. 2 years ago
    Anonymous

    >play on 4K TV
    >1080p (integer scaled) looks good
    >anything between 1080p and 4K (lanczos + sharpening) looks good
    >4K looks good
    >upscaling methods actually work well enough
    >people keep calling it cope
    I'm glad I'm not tech illiterate

    • 2 years ago
      Anonymous

      what do u mean by this?
      I have a 3070 ti and 48" LG C2

      • 2 years ago
        Anonymous

        There's a justified belief that non-native upscaled resolutions look like ass on LCD due to upscaling blur, but at 4K, resolutions above 1080p already look pretty good in my opinion. You can negate the upscaling blur by using lanczos upscaling (a hidden Nvidia Control Panel setting; it results in a slightly sharper image) instead of bilinear upscaling (the default). Then you can apply some sharpening, using Reshade for example, to make the image clearer still.
        FSR 1.0 does a similar thing, but natively and better.

        And integer scaling makes it so there's no upscaling blur, every pixel is multiplied and has the color it should have. It's ideal for 1080p to 4K, image looks exactly like it would on a native 1080p screen.
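        Integer scaling as described really is that simple: every source pixel becomes an exact NxN block, with no blending. A sketch on a toy 2x2 "image" (1080p to 4K is this with factor 2, since 1920x2 = 3840 and 1080x2 = 2160):

```python
def integer_scale(image, factor):
    """Nearest-neighbor upscale by an integer factor: every pixel is
    duplicated into a factor x factor block, colors untouched."""
    out = []
    for row in image:
        scaled_row = [px for px in row for _ in range(factor)]
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

img = [["A", "B"],
       ["C", "D"]]
for row in integer_scale(img, 2):
    print("".join(row))
# AABB
# AABB
# CCDD
# CCDD
```

        No pixel ever gets an in-between color, which is why the result looks like a native 1080p screen rather than a blurred upscale.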

  52. 2 years ago
    Anonymous

    >Realizing I have a 4k monitor for nothing besides a few PS5 games (running at 30 or 40 fps) and will probably run my pc and series s and all my other ps5 games on my 1080p monitor whenever I get back home
