Performance costs of higher resolutions

7800X3D/4090:

1080p
>AVG: 266.3
>1% LOW: 189.4
Δ = 29%

1440p
>AVG: 243.3
>1% LOW: 172.3
Δ = 29%

2160p
>AVG: 169.9
>1% LOW: 124.9
Δ = 27%

% FPS loss as resolution increases
>1080p to 1440p = 9% AVG, 9% LOW
>1440p to 2160p = 30% AVG, 28% LOW
>1080p to 2160p = 36% AVG, 34% LOW
------

13900K/4090:

1080p
>AVG: 243.2
>1% LOW: 183.2
Δ = 25%

1440p
>AVG: 208.7
>1% LOW: 162.4
Δ = 22%

2160p
>AVG: 133
>1% LOW: 111
Δ = 17%

% FPS loss as resolution increases
>1080p to 1440p = 14% AVG, 11% LOW
>1440p to 2160p = 36% AVG, 32% LOW
>1080p to 2160p = 45% AVG, 40% LOW
------

13900K/6800XT:

1080p
>AVG: 152.6
>1% LOW: 123.3
Δ = 20%

1440p
>AVG: 116.7
>1% LOW: 96.8
Δ = 17%

2160p
>AVG: 68.2
>1% LOW: 57.7
Δ = 15%

% FPS loss as resolution increases
>1080p to 1440p = 24% AVG, 22% LOW
>1440p to 2160p = 42% AVG, 40% LOW
>1080p to 2160p = 55% AVG, 54% LOW
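
For anyone who wants to check these, here is roughly how the figures are derived from the AVG and 1% LOW values (a minimal Python sketch using the 7800X3D/4090 numbers; the other sets work the same way, give or take a rounding point):

data = {"1080p": (266.3, 189.4), "1440p": (243.3, 172.3), "2160p": (169.9, 124.9)}

# Delta between the average and the 1% low, expressed against the average.
for res, (avg, low) in data.items():
    print(f"{res}: delta = {(avg - low) / avg * 100:.0f}%")

# % FPS lost going from resolution a to the higher resolution b.
def loss(a, b):
    return (a - b) / a * 100

for a, b in [("1080p", "1440p"), ("1440p", "2160p"), ("1080p", "2160p")]:
    print(f"{a} to {b} = {loss(data[a][0], data[b][0]):.0f}% AVG, "
          f"{loss(data[a][1], data[b][1]):.0f}% LOW")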

  1. 7 months ago
    Anonymous

    As you can see, with the 7800X3D/4090 system, the losses are minimized, while still being quite high going to 2160p.

    With the lower end GPU that people can actually afford, like the 6800XT in this instance, the losses actually grow, to the point that even going from 1080p to 1440p is a significant cost to performance, and going to 2160p is egregious.

    We also see the delta between the average and the minimum decrease as performance decreases, for whatever that's worth.

    • 7 months ago
      Anonymous

      what about 2560x1080? more pixels than 1080p, less demanding than 1440p. still hits high fps.

      • 7 months ago
        Anonymous

        ultrawide is a niche

        • 7 months ago
          Anonymous

          It's not? Every movie is 21:9 or some ultrawide ratio. Games have supported it since The Chronicles of Riddick in 2004.

          • 7 months ago
            Anonymous

            moron

      • 7 months ago
        Anonymous

        You get over 200fps

    • 7 months ago
      Anonymous

      >With the lower end GPU that people can actually afford
      >the losses actually grow
      >going from 1080p to 1440p is a significant cost to performance
      then don't go above 1080p
      >inb4 still using 1080p
      if the argument about performance costs comes down to what kind of GPU people can actually afford, then decisions should be made around it and what kind of performance it can deliver
      putting it into practice: if the GPU i can afford can only give me 1080p 60fps, and going above that there are noticeable losses in performance, then i'd rather buy a good 1080p 60hz monitor and enjoy it for what it is than buy a higher res and/or higher refresh monitor and be constantly mad that i can't run games at my monitor's native res at higher fps.
      If a 1080p 60fps capable GPU is all i can afford, then it makes no sense to spend more money on a higher res/refresh monitor, and if and when i can afford a better GPU, i'll also be able to afford a more fitting monitor for it.

      • 7 months ago
        Anonymous

        What does it mean "to be able to afford"? You're not capable of saving a bit more? When do you decide to stop?

        • 7 months ago
          Anonymous

          you could also try and not ignore all the IFs i used
          i am following OP's argument of "what people can actually afford", im not saying i personally can't afford it.
          so, IF we have to consider what people can afford, which in OP's argument is a 1080p 60fps card, THEN i apply the reasoning i explained

          still, the argument doesn't change: buy your monitor according to what performance your PC can deliver depending on what parts you can afford.
          if you can only afford a 1080p 60fps PC = buy a 1080p 60hz monitor
          if you can afford a higher res higher fps PC = buy a higher res higher refresh monitor

          • 7 months ago
            Anonymous

            I'm not arguing anything. I'm just showing the difference between the best (4090) and something that more people could compare to (6800XT).

  2. 7 months ago
    Anonymous

    I don't see the point of using 4K for a computer monitor. Unless you start going beyond 32 inches, you're not even going to be able to make out individual pixels on a 1440p display.

    • 7 months ago
      Anonymous

      this is the reason people will cite

      the left is 27" 1080p, and the right is 27" 2160p with 200% scaling

      • 7 months ago
        Anonymous

        This is the reason mentally ill apple freaks will cite, you mean. Both images are perfectly legible, imagine tanking your framerates for slightly smoother text.

      • 7 months ago
        Anonymous

        I'm a gay that can see the diff between 144hz and 240hz and these two images look the same to me.

        • 7 months ago
          Anonymous

          The difference between 144hz and 240hz is blatant.

      • 7 months ago
        Anonymous

        The pixel grid on a 27 inch 1440p monitor only becomes visible at something like 20cm from your eyes.
        You don't ever want to be that close to the monitor.

        • 7 months ago
          Anonymous

          you need to be 30 inches away from a 27" 1440p monitor to reach 60PPD, which is basically the minimum for a good experience.

          http://phrogz.net/tmp/ScreenDensityCalculator.html#find:density,pxW:2560,pxH:1440,size:27,sizeUnit:in,axis:diag,distance:31,distUnit:in
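
          If you want to redo that math yourself, this is roughly the pixels-per-degree calculation the linked calculator performs (my own approximation, assuming a 16:9 panel and distances in inches):

          import math

          def ppd(h_px, v_px, diag_in, distance_in):
              # Horizontal pixels per degree of visual angle at a given distance.
              width_in = diag_in * h_px / math.hypot(h_px, v_px)
              ppi = h_px / width_in
              # One degree of visual angle spans 2 * d * tan(0.5 deg) inches at distance d.
              return ppi * 2 * distance_in * math.tan(math.radians(0.5))

          print(ppd(2560, 1440, 27, 31))   # ~59 PPD, in line with the linked calculator
          print(ppd(3840, 2160, 27, 31))   # ~88 PPD for a 27" 4K panel at the same distance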

          • 7 months ago
            Anonymous

            That's around 75cm.
            Whatever your desk or chair setup is, you will be within that distance range of up to a metre unless you really lean forward and get your head too close.

    • 7 months ago
      Anonymous

      >I don't see the point of using 4K for a computer monitor.
      >Unless you start going beyond 32 inches
      You answered your own question?

      4k is great at 40" and up. Below that, sure, you're a moron.

      • 7 months ago
        Anonymous

        Yeah but I can count on one hand the number of computer monitors I've seen that are larger than 27 inches.

        • 7 months ago
          Anonymous

          They're all just people buying the LG OLED televisions and using them as computer monitors.

    • 7 months ago
      Anonymous

      you are supposed to keep buying new things, stupid goy

    • 7 months ago
      Anonymous

      >you're not even going to be able to make out individual pixels
      but that's kinda why i want one

    • 7 months ago
      Anonymous

      I remember in the early Gmod days back when I was in school, my friends and I all came to the conclusion that resolution isn't nearly as important as framerate and PPI, so in about 10-15 years console Black folk will probably finally realize this too.

    • 7 months ago
      Anonymous

      jumped to UHD from FHD last year and kinda regret it. it's a godsend for multitasking. having 3 (or more) applications visible at once pleases my ADHD mind. movies look great obviously. gaming is impossible, i have to play on my old 1080p monitor which i have off to the side. imo go ultrawide or dual QHD if you're an ADHD brained gamer and multitasker

    • 7 months ago
      Anonymous

      4K monitors are insane for work.
      I once opened some satellite photos in full screen on a 4K monitor and felt like crying with emotion. The quality is so high that it surpasses basically any printed material; even things made for professional use, or art books that try to be beautiful, don't come close to the quality of the 4K image.
      But... for games? There's a huge performance problem there, and I think that in today's reality 1440p is still the most sensible option in most cases, unless you're rich and also a bit crazy, spending a lot of money when recent AAAs aren't even worth watching on YouTube, let alone playing.

      • 7 months ago
        Anonymous

        >The quality is so high that it surpasses basically any printed material
        It literally does. 27" 4K is 163 PPI. Printed material is 300 DPI, but those are halftone dots rather than full pixels, so it translates to an effective resolution below 150 PPI.

        • 7 months ago
          Anonymous

          and then you're sitting further away from a 27" monitor compared to a US Letter or A4 page, so the Pixels Per Degree is actually significantly higher for a 4K 27" monitor.
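
          As a rough back-of-the-envelope check of that (my own distance assumptions, and taking the below-150 effective PPI figure for print from the post above):

          import math

          def ppd(ppi, distance_in):
              # Pixels per degree of visual angle at a given viewing distance.
              return ppi * 2 * distance_in * math.tan(math.radians(0.5))

          print(ppd(150, 16))   # printed page at ~16 in reading distance: ~42 PPD
          print(ppd(163, 30))   # 27" 4K monitor (163 PPI) at ~30 in desk distance: ~85 PPD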

  3. 7 months ago
    Anonymous

    I bought a 60 Hz 4K 32" monitor with built-in speakers from Newegg for around $400-$500 in 2021.

    Still works perfectly fine.

  4. 7 months ago
    Anonymous

    For me, it's playing games at 4K 120fps on my 55" LG B9 OLED.

  5. 7 months ago
    Anonymous

    Care less about resolution. Your eyes are most sensitive to three things:

    >Brightness
    >Contrast
    >Color Accuracy
    HDR improves all three of these things, provided you're using an OLED and not some MiniLED.

    • 7 months ago
      Anonymous

      I don't care about any of these things. Resolution is the most important, because resolution means I can fit more things on screen at once.

      A lot of morons seem to have confused "resolution" with "high DPI", when these are completely different concepts. Don't feel bad, lots of people are this stupid, including EVERY PERSON WORKING AT VALVE, because they like completely fricking up their UI design to appeal to the high DPI crowd; played on a normal monitor, your health value in HL2 is about 6 fricking inches tall because they assume every 4K screen must be 24".

      • 7 months ago
        Anonymous

        >Resolution is the most important, because resolution means I can fit more things on screen at once.
        And if things fit on screen with less, then any more is a waste. People had no problems playing vanilla WoW's hardest raids on 1024x768 monitors, or BG3 on the Steam Deck's 800p screen.

        Today homosexuals need 4K screens just to wipe to bosses with the DBM addon.

        • 7 months ago
          Anonymous

          >And if things fit on screen with less then any more is a waste.
          I think anyone who isn't moronic can understand that being able to see more things is better. In any 3D engine not written by rapists and Black folk, FOV is adjustable, which means that you can fit more on screen with a higher resolution.

          You absolutely can play HL1 as in the bottom image. But why would you want to?

          >muh WOW and BG3
          Oh, never mind, you're a troony. Come back when you've grown up.

  6. 7 months ago
    Anonymous

    4090 + 7800X3D anon here.

    What does this mean for me

    • 7 months ago
      Anonymous

      it means that you can get 1080p/240Hz and 2160p/120Hz even in new games, and that it's basically the best possible performance right now until something like an 8800X3D or 5090 is released.

      • 7 months ago
        Anonymous

        Well that's false because I play at 2k and even Baldur's Gate 3 drops to low 110s in the main city.

        Cyberpunk 2077 (fully maxed out tho) is like 130~ fps. Game optimization just sucks and will continue to suck

  7. 7 months ago
    Anonymous

    Cherry picked as frick.
    At launch of the 4090, these numbers were correct.
    One year later and I run TLOU at 4K 90 fps on high settings WITH DLSS.
    Plenty of AAA games don't cross 80 fps on a 4090 at 4K without upscaling, some much lower.
    T. 4090 owner who has gamed at 4K since the 2080 Ti

    • 7 months ago
      Anonymous

      all of the data is from TechPowerUp

    • 7 months ago
      Anonymous

      Cucksoles hit their limits very hard, so hardware requirements should stabilize a little.
      Devs/publishers still need to sell their shit games, and if only a small % of users are able to play the game then their profits will suffer.
      I still hate how some games stutter no matter what; you can get 200+ FPS and still get a lot of long stutters.
      1% lows are important.

      • 7 months ago
        Anonymous

        >I still hate how some games stutter no matter what, you can get 200+ FPS and still get a lot of long stutters.
        >1% lows are important.
        Setting a frame cap with RivaTuner supposedly solves this entirely.

        Say you have a 1080p/144Hz monitor, and the game you play has a 180 fps average but a 115 fps 1% low, which is obviously annoying. You set the frame cap to 144 with RivaTuner, and it makes the frametime graph completely flat, so the 1% low becomes 144.

        I haven't tried it myself, but it seems to work, and it gives you a stable experience. It apparently works "at the CPU level" and works with both Nvidia and AMD GPUs.
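
        For reference, here is a toy model of what "1% low" measures and what a cap does to it (made-up frame times; the exact definition varies between tools, and whether a cap actually lifts the 1% low depends on where the slow frames come from, as discussed further down):

        def avg_fps(frametimes_ms):
            return 1000 * len(frametimes_ms) / sum(frametimes_ms)

        def one_percent_low(frametimes_ms):
            # One common definition: average FPS over the slowest 1% of frames.
            n = max(1, len(frametimes_ms) // 100)
            worst = sorted(frametimes_ms, reverse=True)[:n]
            return 1000 * n / sum(worst)

        # Mostly ~5.6 ms frames (~180 FPS) with occasional ~8.7 ms spikes (~115 FPS).
        frames = [5.6] * 990 + [8.7] * 10
        print(avg_fps(frames), one_percent_low(frames))       # ~178 avg, ~115 1% low

        # A 144 FPS cap forces every fast frame to take at least 1000/144 ms, which
        # flattens the frametime graph; frames already slower than the cap are unchanged.
        capped = [max(t, 1000 / 144) for t in frames]
        print(avg_fps(capped), one_percent_low(capped))       # ~144 avg, spikes remain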

        • 7 months ago
          Anonymous

          It doesn't save you from stuttering, but it will lower the perceived difference. I cap my framerates to 120 or 90 depending on the game. This also reduces input lag vs an uncapped framerate if you hit 100% GPU usage and aren't using Reflex or Anti-Lag.
          If a game freezes for 150ms there is nothing you can do; even 30-50ms spikes are vomit inducing.

          • 7 months ago
            Anonymous

            I think it legitimately saves you from the stuttering. It frees resources to handle spikes. This is my cringe hypothesis. It seems to do that, from the videos I've seen.

            • 7 months ago
              Anonymous

              It depends.
              If you are getting traversal stutter or PSO stutter in a UE4/5 game, you will get the exact same freeze even if you cap your framerate.
              But if you are just rendering a more demanding scene and you have some performance headroom, then you won't experience nearly as many drops. It still depends on which component is the bottleneck; it might not matter if you are VRAM starved, for example.
              Still, running games uncapped is a bad idea for the most part.

  8. 7 months ago
    Anonymous

    Feels like 1440p is still the sweet spot.

  9. 7 months ago
    Anonymous

    small

  10. 7 months ago
    Anonymous

    I play 240p on a small handheld, the little screen allows me to emulate old vidya and the low battery consumption lets me play for long periods 🙂

  11. 7 months ago
    Anonymous

    gay ass thread

  12. 7 months ago
    Anonymous

    I've got a 4070 Ti and a 7800X3D. I want to get an OLED monitor, what are some good ones?

    • 7 months ago
      Anonymous

      There's no such thing as a good OLED monitor, as they're not made for desktop usage.

  13. 7 months ago
    Anonymous

    i went from a 1080p ultrasharp to 1440p ips, to 144hz 1440p ips, to 240hz 1080p. i've used 4k gaming monitors at a friend's house, and a "gaming" 4k tv at home which sucked ass. i need a 240hz monitor with over 280fps or a crt. i move very quickly and anything else just doesn't feel good. the 1440p 27 inch was nice for general use but i found it harder to view when i was really trying to play games hard. i'd like a 24" 1440p 240hz+. i didn't really enjoy 4k at all besides watching videos in 4k and crazy multitasking.

    • 7 months ago
      Anonymous

      Are you primarily playing competitive FPS?

      • 7 months ago
        Anonymous

        i pretty much only play cs, bf, osu, and rs. every once in a while i play something with friends.

  14. 7 months ago
    Anonymous

    >1080p/60hz monitor
    >7800X3D/4090
    >getting 2,000fps on HL1
    >GPU and CPU at 100% utilization

    • 7 months ago
      Anonymous

      >still stutters down to 50FPS randomly for no good reason at all

  15. 7 months ago
    Anonymous

    I want a 32" 4k oled hdr1000 monitor.

  16. 7 months ago
    Anonymous

    >modern hardware
    >modern video games
    lamo

  17. 7 months ago
    Anonymous

    About to order a 7800X3D/7900XTX/4K 240hz combo, am I moronic?
