I'm looking to buy a 1440p monitor. I can't decide if I should get a 75hz one or spring for 120+hz. (It looks like you can't even buy one with less than 75.)
I play a lot of emulated games, very few FPS or racing games, and when I look on the pcgamingwiki many of the games I play seem to have issues with framerates above 60.
Also I have an rtx 2060, so I'm already concerned about running 1440p60 in the first place. Having a better monitor means it will hold up whenever I upgrade my GPU, but I have a feeling my next upgrade will be for 4K instead of higher framerates.
Idk, hoping for advice but expecting to just be called poor.
1440p 120+fps > 4k 60 fps
Even if you only run your games at 60, the higher refresh monitor is great for smoother everyday browsing. Every non-gutter display has freesync too, so there's no tearing as long as you stay under the monitor's max refresh rate.
>1440p 120+hz monitor
>can either play in full res or memescale from 720p integer scaling for massive framerate boost
it's the perfect spot to be.
We live in a world where 4k 144hz exists, no need to compromise
>144hz
This is low though. I can easily see the difference between 300+ and 144. I have great eyes though
No you don't.
Recently got a 4K 144hz monitor and I can't tell the difference from 1080p 60hz. Yes I have a GPU worth more than its weight in gold to run it. You people are moronic.
Did you also recently get brain damage, by chance?
>I can't tell the difference
>You people are moronic
Ironic.
>and I can't tell the difference from 1080p 60hz
Either you actually didn't go into your settings and switch to 144hz, or you have severe issues with your eyes or brain, or both.
Are you using an HDMI 1.3 cable or something?
Did you even turn up the hz in windows display settings?
Same here, but got a 360hz 1440p monitor and I don't see any difference with my old 50hz 768p monitor.
>Every single non gutter display has freesync
Tried out several monitors last year and VRR has been shit every time. It's way less hassle to just buy a 240Hz monitor.
Get a job. Save up for an OLED.
Shit that has poor quality and costs more is worse than just being a cheapass because now you're just a dumbass who paid premium for cheap shit.
this anon knows whats up.
people who buy top of the line for bragging rights are complete morons. mid tier is usually the best because it's affordable and it's not garbage that will fall apart or underperform.
Shut up poor
Spring for the one that has BFI.
High refresh rate is pointless without it.
OLED is stopgap technology with all the flaws of LCD and Plasma combined. The sooner we get rid of it the better.
OLED is great though. With strobing like BFI it literally has zero flaws except burn-in, and a lot of the newer models even have very high brightness, so that's no longer an issue either. It's purely poorgay cope. Extremely low input lag, near-zero pixel response time, perfect per-pixel blacks, great color, great viewing angles, great uniformity, etc.
>zero flaws except burn-in
Which is a massive fricking flaw, especially if you either play one game a lot, or you use your computer for productivity stuff for multiple hours per day.
Burn-in is massively overblown. I used an OLED phone for hours every single day for 8 years and never encountered any burn-in, and that's with static icons and UI elements. It seems like a problem that is almost exclusive to morons who blast full brightness with static content for 12+ hours a day.
And yet when people do burn-in testing for TVs and monitors, it's pretty easy to replicate. On some TVs and monitors, it already shows up at less than 1000 hours. It's just that it's usually so gradual that most people don't even notice it.
Your display will last like 20x longer if you just turn down the damn brightness from max. I'm not saying it doesn't exist, it's just a completely unnoticeable non-issue unless you blast static elements with max brightness. Even doing the burn-in tests of a single color on my phone, I don't notice any uniformity issues and I used this every day for nearly a decade.
OLEDs have an insane problem when you use VRR or ALLM
ive not had this particular problem, but the gamma flicker on my OLED when using VRR is really annoying.
that said its also the nicest monitor ive used and i love it. There's just no perfect monitor. Buy what's in your budget and enjoy it.
just get an ultrawide that does 144hz. it's not difficult. just don't go for oled, those things are horrible for precise colors.
I got my first ultrawide with 120hz bought the cheapest chinese noname, just to test if I actually like it compared to 16:9, and it's been a revelation. I don't think I can ever go back after losing my 21:9 virginity
you can get a decent 1440p 144+ hz IPS monitor for around $200 these days
Just get 144hz. You pay very little for that extra capability and monitors are the kind of purchase that can last a long time across multiple computers.
I think I will go for the higher frame rate. I've been eyeing up this monitor, it's under $200 and is rotatable which seems useful. Does this raise any red flags?
I mean you don't have to make the decision right this second. There are probably 100+ options for monitors in that res and refresh rate. You can look around and compare them or look at reviews on the monitors and see which hold up.
That's true. It seems like every monitor has 1 star reviews saying that it broke for that person. It's hard to tell if the monitors are faulty or a handful of people are inevitably going to have bad luck due to shipping or something else.
this is probably better value in that range
I actually saw the 32" version of this at a pawn shop. I almost bought it, but it would be too tall to fit in my desk. Most 27" monitors should fit.
I'm trying to avoid "gamer" aesthetics, also that monitor is missing an audio-out port which would complicate getting audio from a console or something.
Bought the 31.5'' version of this one a month ago, didn't expect it to be so much bigger than my previous 27'' LG. No complaints so far save for its speakers being expectedly bad. But who even uses them?
Ganker recs this when it's on sale for $200
https://www.dell.com/en-us/shop/dell-27-gaming-monitor-g2724d/apd/210-bhxc/monitors-monitor-accessories
BenQ Mobiuz EX3210U. thank me later.
Seconding BenQ, excellent material and color quality for the price. The Dell and ASUS monitors I tried in the same price range were plastic crap with serious color and vignetting issues.
U can run them at 60hz, the stated spec is just the max spec
Anyways i would say go for oled if u can, it makes a big difference even for old games that cant run at 144hz or whatever
Don't listen to this anon
Some monitors run worse at lower specs than they were tuned for. You can run a monitor at its set refresh rate and still play games at any framerate and benefit from it.
>monitor
>2024
Be a fricking man and buy a TV as your monitor instead.
t. currently Sony 55" OLED
>>2024
>Be a fricking man and buy a TV as your monitor instead.
>t. currently Sony 55" OLED
BASED AND TRUTHPILLED
NORDMENDE 50 Uhd 4k INCH HERE
Gigachad
>NORDMENDE 50 Uhd 4k INCH HERE
u wot m8. Never heard of that brand and looked it up. Is it really 50hz? Seems like a weird number.
75hz 1080p
>captcha AY M8
Why would you buy 1440p in 2024? Just get a 4k 144hz monitor.
Maybe because nothing can run a 4k 144hz monitor?
You don't use your computer for anything other than games?
I do lots of shit and 1440p is enough for me. I have two 1440p 27" screens.
This is just like back when morons would defend 1080p and called 1440p a scam.
Or the morons who said IPS was a scam.
Or the morons who said 120hz was a scam.
My monitors are 144hz and 170hz. Both are 27" 1440p IPS. I'm not going to play games at 60hz just for 4k. As I said nothing can run 4k at 100 + fps.
Depends on the game and what GPU you have
My 7900XTX does pretty well but it's still a big hit compared to 1440p
Why would you play games at 60hz in 4k? Just turn the resolution down to something like 1800p or 1440p. FSR and DLSS look almost as good as native 4k while running at 144hz too, but I have a feeling you're some kind of purist who hates them even though they only look like shit on 1440p and 1080p monitors.
You can get 4k for $250 on the low end to $400-$500 for one that does 144hz. You're buying a high end PC but slapping a shitty monitor on it, makes no sense.
32" 4k is the sweet spot btw. 1440p 32" looks like shit.
32 inch is too big for a monitor on a desk.
32" is the perfect size. It's like having an ultrawide that doesn't have the vertical screen space removed.
lmao fricking called it. You let others dictate your life rather than doing what makes the most sense. Stop being so cucked by Ganker, you're never going to benefit from virtue signaling here.
Nah. Even my 28 inch is too big and it protrudes off the back of my desk. And my desk has a lot of depth. Definitely not the perfect size. Maybe if you have a 5 foot deep desk
>desklet with his 60cm depth desk complaining
sounds like a (You) problem not a monitor size problem
So my desk is 3.5 feet deep and it’s not enough for 28. How about yours?
>106cm depth
yeah post your desk you lying sack of shit
i use a 32" on an 80cm deep desk and it protrudes thanks to the stand by like 20cm, so my distance eye to monitor is like 80cm
Got it. So you have a suboptimal distance.
>hurr durr suboptimal distance hurr durr
why dont you post the science then, did you know your eyes have a 180*ish FOV?
I mean you just told me your dimensions. I don’t know what else to tell you. I hope you get a bigger desk in the future since it even isn’t enough for 28
>im trolling
good to know, no need to reply to you anymore
You didn't call shit. I know that nothing can run 4k at 100+fps and that is what I need for my games and use.
You can't run your browser and file manager and image viewers at 144hz 4k? Bullshit, even a 10 year old gaming computer can do that.
How about I fricking kill you, how about that? HUH?
You can't because I'm a white male and you're a troony.
I am fully white and male. How about that you fricking c**t?
>sold his DNA data to israeli testing companies
Jesus save your soul.
lmao you actually sent the israelites your genetic makeup
>I am fully white
>53% Irish
>monitor on a desk.
>Look at my computer desk son! I store my commemorative plates from the Great War of Northern Aggression in it to. Back in my day we sat in front of these tiny little screen we called 'monitors' and rubbed mouse balls"
>Why would you play games at 60hz in 4k?
You had better not be the guy I was responding to. I'm not going to use FSR or DLSS in my games. I need real frames to play multiplayer.
>32" 4k is the sweet spot btw. 1440p 32" looks like shit.
The actual sweet spot is a 27" 1440p 144hz monitor. Suggesting 4K is fricking moronic these days because 9/10 games are extremely unoptimized and it's not even feasible for 90% of people to have the hardware to actually run 4K properly. 4K might LOOK a bit better, but it FEELS like absolute garbage when you have to run everything at 60-80 FPS and hz.
t. 7800X3D + 4090 using a 27" 1440p 240hz monitor
Why do you buy a high end gaming PC and only use it for playing video games?
Why do you intentionally cuck your hardware in literally every other task like programming, web browsing, movies/anime, or even just basic screen space?
Depth doesn't matter though, it's width. Even a 32" monitor only takes up a bit of space when it comes to depth.
Uh it’s not about space, it’s field of view.
You get both with 4k 32".
Even 1440p and 1800p look fine without FSR/DLSS. But there's no reason to avoid using them when you have 4k. Ganker actually virtue signals about it. Then again this is the same board that used to virtue signal about TN panels a decade ago.
Yes you get a bad field of view because your monitor is too close to your face
>Why do you buy a high end gaming PC and only use it for playing video games?
Because I spend 90% of my time on my PC playing video games.
>Why do you intentionally cuck your hardware in literally every other task like programming, web browsing, movies/anime, or even just basic screen space?
Why do you want to cuck people into playing video games at sub-par FPS and refresh rates?
>Why do you buy a high end gaming PC and only use it for playing video games?
Ganker - Video Games
>32" 4k
ghey and smal
>nothing can run 4k at 100 + fps.
2016 called, they want you back.
reddit called, they want you back.
People said that back when 1440p was wildly overpriced for what you got, which isn't the case anymore. 4k is the one that's overpriced now. When 8k comes out 4k will be worth it. Except for television, definitely get a 4k TV.
4k is so great for non gaming
you can integer scale 1080p on a 4k display and get better than native 1080p on the same inch display, you dumbfricks. Ganker is literally the dumbest board, you always act like HURR DURR IF YOU BUY X RESOLUTION MONITOR YOU CAN ONLY USE IT AT NATIVE HURR DURR HIS POTATO PC CANT RUN IT (PROJECTING)
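For anyone who wants to sanity-check the integer scaling talk, here's a quick sketch (the resolutions are just the common 16:9 ones, nothing thread-specific): a source resolution integer-scales when both of its dimensions divide the display's dimensions by the same whole number, so every source pixel becomes an exact NxN block with no interpolation blur.

```python
# Sketch: which render resolutions integer-scale onto a display.
# A resolution qualifies when both dimensions divide the display's
# dimensions by the same whole number, i.e. each source pixel maps
# to an exact NxN block of physical pixels.

def integer_scale(display, render):
    dw, dh = display
    rw, rh = render
    if dw % rw == 0 and dh % rh == 0 and dw // rw == dh // rh:
        return dw // rw  # the NxN block size
    return None          # only blurry non-integer scaling possible

uhd = (3840, 2160)
for res in [(1920, 1080), (1280, 720), (960, 540), (2560, 1440)]:
    print(res, "->", integer_scale(uhd, res))
```

That's why 1080p maps cleanly onto 4k (2x2) and 720p onto both 1440p (2x2) and 4k (3x3), while 1440p on a 4k panel always needs non-integer scaling.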
Shh they're not ready for integer scaling. Let them fight over stupid things
144Hz is great. If you emulate you can even use BFI which makes retro games even better.
Get higher refresh rate
It's a huge upgrade over 60hz, and for any game that has issues you can lock the framerate through the GPU driver.
>but I have a feeling my next upgrade will be for 4K instead of higher framerates
I recently upgraded to 4K and it's been a mixed bag. On one hand the extra screen space on the desktop is wonderful, but for games the jump really wasn't as huge as I hoped, not like going from 1080p to 1440p, and modern games run worse. I also get sick if the framerate is too low or inconsistent.
Also you need a beefy GPU to use 4K without upscaling.
Do you remember what it was like going between 30 and 60 fps and how jarring 30 fps becomes? That’s what it’s like going from 165 back to 60. Get the higher refresh rate
get 4k 320hz monitor, it's 2024 not 2018
Buy yourself a second hand 1440p 144Hz monitor, there will be some rich idiot selling a nice one near you to make room for his brand new monitor
When I last bought a monitor I found it quite confusing: there are pros and cons for each type of monitor, and no clear best technology right now. Maybe OLED is it, maybe not, time will tell. As usual it's not worth buying top of the line either way
But 1440p 144Hz is a really nice middle ground right now I think, both in terms of monitor price (especially second hand) and hardware requirements
An RTX2060 might not play everything you want to play maxed out, but you'll likely keep your monitor a lot longer than your GPU, and high refresh rate is better than high resolution IMO
>lmfao poor
Who the frick cares, the best thing about PC gaming is how cheap it can be, I used an old office computer and monitor with an even older and shitter second hand GPU for a long time
This is the best way to go if you're going to buy 1440p. There's no reason to buy one new. If you're going to buy a new monitor get 4k. Otherwise just go find a used 1440p monitor from a few years ago that used to be top of the line.
Buying new 1440p is just moronic because you're paying more for less.
That's great. Most people use their gaming PCs for work and gaming, not just gaming.
>sub-par FPS and refresh rates
A 4k 144hz monitor will get you the same framerates when you render games at 1440p, what the frick are you talking about? You are limiting your PC for no reason. Even browsing the web or watching youtube videos is better on 4k.
That's wrong, Ganker buys high end gaming computers and uses them to shitpost.
>A 4k 144hz monitor will get you the same framerates when you render games at 1440p
What? The frick is this? Unless I'm not understanding you correctly, 4K will be a massive hit to your framerate compared to 1440p gaming.
You can turn down the resolution, homosexual. If you use DLSS to upscale it then it'll look almost as good as native 4k (or BETTER than native 4k if the game forces TAA on you).
If you use built in scaling on your monitor then it's still going to look good because 4k monitors under 42" have high pixel densities, which make the scaling less blurry.
If you're a mega autist who can't STAND non-integer scaling then just play your games at 1080p with perfect 2x2 scaling. Then you get even MORE fps.
Yeah, that's not how it works. 1440p looks terrible on a 4K monitor. This is absolutely horrendous advice. Stop bringing up fricking garbage like DLSS and all that trash.
Fully commit to 1440p or 4K.
This troony here doesnt know what integerscaling is. Has nothing to do with DLSS/FSR.
See
>If you're a mega autist who can't STAND non-integer scaling then just play your games at 1080p with perfect 2x2 scaling. Then you get even MORE fps.
You can't refute my arguments however, you simply don't know enough about the topic to even have a good faith discussion. You're going to be another one of those morons who complains about something now and praises it later, just like you did with 120hz and IPS panels.
which one of you is the "optimal distance" homosexual
The other one. There is no "optimal distance" now that UI scaling exists. You can have your face right up against the screen if you like, it just makes 100% scaling that much easier to use.
Integer scaling is great for old games that were made to only work at specific resolutions but it's a shit compromise for modern games, you're rendering one quarter of the pixels of your monitor, and for what? So that some autistic anon can be happy about someone else's setup?
Which is why everyone who isn't a massive moron uses DLSS or FSR on 4k. You render at 1440p and get visuals that are nearly as good as native 4k.
>comparing upscale to TAA
Have you ever played RDR2 without TAA? It looks like absolute dogshit thanks to all the leaves and grass. TAA is too blurry, no-TAA is too shimmery and unrealistic. FSR and DLSS are sharp without the shimmer.
>FSR and DLSS are sharp
What in the frick are you saying? The whole downside of FSR and DLSS is that they aren't sharp and always make things muddy/blurry. That's why there's always a slider included to increase sharpness artificially.
The point is DLSS looks better than native 4k+TAA and only slightly worse than native 4k in a game that force TAA.
There are no downsides to it. There's no excuse to be buying new 1440p monitors in 2024. If you REALLY want 1440p then go buy a used monitor for under $200.
Linux doesn't downscale, dumb frick. It renders everything larger, which you can see in my image because of how large the text and UI elements are. No wonder why you use 1440p, you don't know how computers work.
then why do your screenshots look like dogshit? your font rendering looks like shit
Because I was showing you how you can read people's screenshots using IMAGE HOVER.
Though I'll give you a pass on that because I realized how confusing the screenshot was after I posted it.
Pic related, notice how the text is larger to compensate for the smaller pixels. Everything looks sharper as a result.
Ancient dogshit 3600 Ryzen and a 7800 XT. Like I said, no excuse. Just use FSR and DLSS and it looks almost as good as native 4k. Just don't use framegen.
>Ancient dogshit 3600 Ryzen and a 7800 XT
It's laughable that you are advocating for 4K monitors, while you can't run 4K games.
Are you phone posting right now? That would explain why you can't imagine using a gaming computer for anything other than gaming. moron.
Pic related is better, you can see how hovering over an image makes everything on the screenshot readable. Blurry and ugly non-integer scaling? Yes, that's how image hover works.
But you can't do this on a 1440p screen without having a fullscreen browser.
>See how my text is small? See how your text is big? That's downscaling.
lol
What the frick are you talking about?
Look at the width of this image. Can you read the text? I can. This is image hover on a 1440p display with the browser at a width of 900 pixels.
This guy is a fricking moron.
And now look at how blurry your text is everywhere in your screenshot.
You also have blurry thumbnails too. See
at this point you're just self clowning holy frick you're moronic
He literally can't read that text. His screen is so small and his PPD is so high that he cannot physically resolve what you or I can. You have to save your images upscaled 150% for him to be able to see what we see, because he doesn't understand just how badly he fricked up.
>That would explain why you can't imagine using a gaming computer for anything other than gaming. moron.
We are literally on a video game board, talking about monitors for video games, and you keep going "BUT MUH NON-GAMING DESKTOP EXPERIENCE". Are you okay in the head?
>But you can't do this on a 1440p screen without having a fullscreen browser.
He could literally do that. Assuming your image is the extent of your browser size, let's scale it back down to how text would look to a normal person who didn't fall for the scaling meme. It's about 1000px wide, let's be nice and call it 1200px because you've got some stupid shit on the side.
Well, 1440p has a horizontal resolution of 2560. You can fit 2 1200px windows side by side into 2560 no problem.
This is the problem with morons. You can't fricking count. You're using a 1440p screen.
You have no idea how pixel density works. You actually are so moronic you think it's "downscaling" to render everything with more pixels.
If you buy a 1440p monitor just because someone told you it's "better for gaming" you fell for a scam. Sorry anon. You willingly gimped your computer when you could have easily got the same performance with DLSS on a 4k monitor.
Now you must be trolling. No one can be this stupid.
>you fell for a scam
You literally bought a 4k monitor while having a 3600 Ryzen and a 7800 XT. You are the one who fell for a scam. Why the frick are you talking about anything being "gimped" when YOUR setup is absolutely fricking gimped.
I used to play with 2600 ryzen on 4k monitor and it was fine because most games these days support dlss
Even with DLSS, you are still extremely gimped with a 3600 Ryzen and a 7800 XT and attempting to play games in 1440p, let alone 4K.
Avatar: 7800XT - 44 FPS average at 4K with DLSS/FSR
Alan Wake 2: 7800XT - 51 FPS average at 4K with DLSS/FSR
Cyberpunk: 7800XT - 36 FPS average at 4K with DLSS/FSR
Starfield: 7800XT - 60 FPS average at 4K with DLSS/FSR
Lords of the Fallen: 7800XT - 43 FPS average at 4K with DLSS/FSR
Robocop: 7800XT - 55 FPS average at 4K with DLSS/FSR
>source: https://www.youtube.com/watch?v=x6EfOf0ZoAM
In conclusion, you are a fricking moron who fell for a scam.
44fps is fine I would prefer game in 44fps 4k than 1080p and any fps really
>44fps is fine
You are mentally damaged.
>7800 is overkill for the games I play
>Video games aren't "literally everything" Black person.
Holy frick the amount of cope. Just admit that you literally got scammed. You bought into buying a 4K monitor while having a system that can't even get a steady 60 FPS at 4K WHILE having DLSS/FSR cranked to the max.
Now all your posts make perfect sense. You are literally in post-purchase rationalization mode. It's clear as day.
Don't play dumb homosexual. You saw my posts where I told you people use their computers for more than just video games. 4K makes everything better, INCLUDING video games.
There are the same number of pixels on all 4k screens. The size you fit them into determines how sharp your display is. If you stretch 4k out to a 40" screen then it's as blurry as your 1440p monitor. 32" 4k is slightly sharper, 27" 4k is way sharper, and 24" 4k is extremely sharp.
All of the 4k options are as sharp or significantly sharper than 1440p 27" screens.
Pic related.
>4K makes everything better, INCLUDING video games.
There's that post-purchase rationalization again. Meanwhile you can't even properly run most modern games at 4K with your setup, even with DLSS/FSR cranked to the max.
>in b4 "I ONLY PLAY OLD GAMES, SO I DON'T CARE"
Keep projecting anon. I knew what I wanted before I bought it. I would have gotten 5k 27" if there were any good monitors, unfortunately there aren't.
You're a frame troony that counts frames instead of playing games. People have been gaming in 20fps max resolution since the 90s. For majority of games visuals > framerate
>People have been gaming in 20fps max resolution since the 90s.
I was playing UT99 and Q3 at 100 FPS + 100hz on my Samsung SyncMaster in 1999 while you were still in your dad's ballsack.
>i was playing multiplayer slop in the 90s
Cringe flex
The golden age of gaming was from 1990 to around 2007-2008. You are just a bitter little zoomer who was born into the era of most games being actual slop.
>the golden age of gaming was when I kept upgrading my resolution and didnt care about frames
>*stops upgrading resolution, becomes a frametroony*
>gaming le dead oh noez Dx
I got a Samsung Syncmaster back in 2010 for my 13th birthday
You are a poor third world country moron with brown skin. Why do you have nasty ass trannies living rent free in your head?
Go back to playing pong in 240i 999fps and pretending to enjoy it
>and pretending to enjoy it
Unlike you, I actually do enjoy video games. Which is why I have a 7800X3D + 4090, so I can play every video game with maxed out graphics and maximum smoothness.
>Unlike you, I actually do enjoy video games. Which is why I have a 7800X3D
moron paypig detected
2WHY THE FRICK WOULD YOU PLAY VIDYA ON ANYTHING LESS THAN A 50 INCH SCREEN THESE DAYS? ARE YOU A TARD?
I'm sorry that I am not from a third world country like you, and that I actually have a lot of money. Please forgive me.
>Unlike you, I actually do enjoy video games. Which is why I have a 7800X3D + 4090, so I can play every video game with maxed out graphics and maximum smoothness.
STOOOOOOOOOOOOP
The bottom has the most sovl. Beautiful kanji.
Sharpness is a consideration, but sharpness corresponds to a loss of real estate. The sharper you want the image, the more pixels have to be wasted to achieve something pointless.
Here's the thing, the text is just as readable at 132ppi there. There's a reason the industry settled on 60 PPD decades ago, sharpness is a balancing act. Too sharp, and you're wasting pixels. Not sharp enough, you can't read shit. Personally I argue that increased sharpness past the 20/20 vision standard is a waste of time until you have a monitor that encapsulates your FOV, and that doesn't happen for monitor sizes under 80 inches.
>The sharper you want the image, the more pixels have to be wasted to achieve something pointless.
Imagine having to cope by literally typing out "sharpness is pointless". As if having a 32" screen that's sharper than a 27" screen from 10 years ago is a bad thing.
I was one of those people using 27" 1440p back when you morons said 1080p is all you need.
I'll be there using 8k when you try to tell me 4k is all you need.
>As if having a 32" screen that's sharper than a 27" screen from 10 years ago is a bad thing.
It's bad when you could be using a big display for the same performance cost and get a hell of a lot more real estate. You're literally just using a 1440p screen but the text looks a bit nicer. You'll be using an 8k 40" monitor and raving about it while I'm using an 86" 8k monitor, and I'll be laughing at you then, too. In 20 years, you'll finally get on my level as you upgrade to a 16k 86" monitor, but you've got a long and sad road ahead of you before then.
The only 32" monitors worth buying at 4k. 1440p at 32" is completely moronic. Blurry garbage.
>You're literally just using a 1440p screen but the text looks a bit nicer.
Yes.
>You'll be using an 8k 40" monitor and raving about it while I'm using an 86" 8k monitor,
No, I'll be using 8k 32" because 32" is the sweet spot for gaming and real work.
Why are you attempting to "lil bro" people when you have a 4K monitor with a 3600 Ryzen and a 7800 XT?
>a 4K monitor with a 3600 Ryzen and a 7800 XT
Peak setup
I'm a guy with an actual 4k monitor, not this poser.
>32"
Let's assume you're sitting 2.5 feet away. At that distance, you're getting about 77 PPD, which is the average vision limit. At 8k, you'd be getting 154 PPD, which is so far beyond what can be resolved that you'd have to be a fricking moron to buy such a display. Hope you wear glasses, buddy.
The human eye can't see more than 1080p 24fps
I don't play AAAslop. 7800 is overkill for the games I play, but it can do AI without being nVidia so it has that.
>Pixel density doesn't matter.
It does if you don't want blurry text, blurry UIs, blurry thumbnails, and blurry video artifacts.
And your image doesn't matter, there are more pixels per inch in a 4k 32" than your 1440p 27" monitor. Facts do not care about your feelings.
Video games aren't "literally everything" Black person.
>And your image doesn't matter, there are more pixels per inch in a 4k 32" than your 1440p 27" monitor.
There are more pixels in the middle screen on that image, I'm sure even you can verify that.
Can you see more of the background than the 1440p guy? It's a simple fricking question.
You have gimped your screen. Well, really, your screen was gimped from the factory, being too small to take advantage of the true benefits of higher resolutions, what you did to it after was just to get a usable experience.
You willingly bought a low resolution screen just to avoid using DLSS/FSR. You could have had a 4k monitor that makes literally everything better, but you instead chose to buy 1440p in 2024 so you can virtue signal on Ganker.
If you want to be proud about being a moron on Ganker don't be surprised when people call you a moron.
There's a few enlightened anons in this thread, unfortunately you savages are the majority.
A 4k 32" monitor has 135ppi.
A 1440p 27" monitor has 105ppi.
Which has the higher pixel density?
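The density arithmetic itself is just the pixel diagonal over the physical diagonal; the figures quoted above are rounded a bit low, but the ordering is the same:

```python
import math

# PPI = resolution diagonal in pixels / screen diagonal in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 32)))  # 4k 32"    -> 138
print(round(ppi(2560, 1440, 27)))  # 1440p 27" -> 109
```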
>You bought a 1440p monitor. When you run a monitor with scaling this severe, you are using it at 1440p.
"Scaling" is when you render everything with more pixels. You seem to think there's some kind of quality loss when you are adding significantly more pixels to show quality with. Again, completely fricking moronic.
>Buddy, I know you can't read that text.
And now you're not even making sense. Stop while you're ahead anon. People who understand computers can see how cringe this is. You're lucky I'm still bothering to argue with (You) though.
There's mods for that these days. And like I said you can drop the resolution to 1800p without it looking like a blurry mess. 1440p is kinda blurry but still perfectly fine for most games.
You can just avoid it by using 120hz tho.
>>A 4k 32" monitor has 135ppi.
>A 1440p 27" monitor has 105ppi.
>Which has the higher pixel density?
Pixel density doesn't matter.
Look, I've developed this image to show you. On the left, is that guy's 1440p screen. In the middle, that's your 32" "4k" screen. On the right is my screen. Do you think the guy in the middle is getting a better experience than the guy on the left, or is it the same but a bit bigger? Can the guy on the right see more of the exciting white background, despite having the "same" resolution as you?
>You seem to think there's some kind of quality loss when you are adding significantly more pixels to show quality with
Yes, there is. You're losing real estate.
>You could have had a 4k monitor that makes literally everything better, but you instead chose to buy 1440p in 2024 so you can virtue signal on Ganker.
You are a clueless moron. See
>You have no idea how pixel density works.
Says the guy who has no idea how pixel density works.
>If you buy a 1440p monitor
You bought a 1440p monitor. When you run a monitor with scaling this severe, you are using it at 1440p.
>Now you must be trolling.
Buddy, I know you can't read that text. You just said it looks blurry, you've PROVEN you can't read the fricking text. I know you can't read the text because your browser is operating at 150% scaling, you'd have to zoom in on the image. To you, the text is too small, because everything on your monitor is too small. That's why you have to scale everything bigger.
>could have easily got the same performance with DLSS on a 4k monitor
you know not every game supports dlss? and you still get better performance with dlss with 1440p compared to dlss with 4k. no matter how you look at it 4k is worse for performance
which screenshot are you even referring to that we cant read using image hover?
>There's no excuse to be buying new 1440p monitors in 2024
What are your computer specs?
>Linux doesn't downscale,
See how my text is small? See how your text is big? That's downscaling.
I know this is hard to understand because in your peanut brain "text big = upscaled". You have taken a 4k monitor, and made everything smaller. Your actual resolution is 1440p level, due to scaling. This is why you can't fit as much stuff on your screen as I can, and why it's really funny that you're whining about 1440p when you're using a functional equivalent because you bought a monitor that is too small. You literally don't know what 4k is, your eyesight is probably too bad for it. Move about a foot closer to your monitor and turn off scaling for a taste, but you'll probably hurt your eyes.
All of those games just launch fullscreen 4:3, they're auto-centered. It's about as complex as saying I want the game in 4:3 and not 16:9. The last one auto-opens in the middle of my display, but I typically move it to one corner so I can keep my notes open in the rest of the screen.
TAA basically is upscaling. It allows you to render at lower resolutions with less aliasing than you would normally get at low res by sampling multiple frames.
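The multi-frame sampling idea can be sketched in a few lines; this is a toy one-pixel model of temporal accumulation, not any engine's actual TAA implementation:

```python
# Toy 1-pixel model of TAA-style temporal accumulation: blend each new
# sample into a running history. The history converges toward the true
# average, smoothing out aliasing flicker between frames.
def taa_accumulate(samples, alpha=0.1):
    """Exponential history blend: history = history*(1-alpha) + sample*alpha."""
    history = samples[0]
    for s in samples[1:]:
        history = history * (1.0 - alpha) + s * alpha
    return history

# A pixel flickering between 0 and 1 (aliasing) settles near its true value 0.5.
flickering = [0.0, 1.0] * 50
print(round(taa_accumulate(flickering), 2))
```

Real TAA also jitters the camera per frame and reprojects the history with motion vectors, but the core of the technique is this cheap running blend.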
>A 4k 144hz monitor will get you the same framerates when you render games at 1440p, what the frick are you talking about?
So you are advocating for buying a 4K monitor, just so you can get more screen estate for "work", but then always use downscaled 1440p(which will look worse than on an actual native 1440p monitor)? If so, you are one incredibly weird person.
>or watching youtube videos is better on 4k
How often do you watch Youtube videos in 4K?
Why would you need to use scaling on 32" 4k? Just put on your glasses if you don't have aryan 20/20 vision.
>downscaled 1440p(which will look worse than on an actual native 1440p monitor)
This is where you show you have no clue what you're talking about. You are using more pixels to render everything, which means the 4k monitor will be significantly sharper.
This is your final (You). Go learn a bit more about computers before you try discussing them with someone on my level.
Which is why you would use DLSS or FSR. But you refuse to do so because you want to virtue signal to people on anonymous image board. No one is impressed that you refuse to use DLSS.
>This is where you show you have no clue what you're talking about. You are using more pixels to render everything, which means the 4k monitor will be significantly sharper.
You are insanely clueless. No wonder you are advocating so hard for 4K monitors. It's also fricking laughable that you are suggesting people to use DLSS/FSR in combination with 4K, while talking about clarity/sharpness and hardware requirements. Clueless moron.
Post a picture of your setup with your digits on screen
Here. It's missing another 1TB Samsung 980 Pro that I had to RMA. My main monitor is a Samsung Neo G7 27" 1440p 240hz (I play FPS games).
I said picture of your setup not a fricking screenshot
Why the frick would I go through the trouble of doing that, when you are just going to claim that it's not mine anyway? I just showed you my specs.
Here's the model name of my monitor.
to prove your 3.5ft desk and setup
>to prove your 3.5ft desk and setup
I'm not the guy who has been talking about desk sizes. But if you want to know which desk I am using anyway, I'm using the old big Ikea Galant that's now discontinued. Pic related.
>70cm
so a desklet
It's 2,2m wide and 80cm deep. I have 3 monitors, a whole audio setup behind, and my tower(HAF 700, one of the biggest cases in existence) on it as well. If this is a "desklet" to you, then I don't know what the frick kind of monstrosity desk you use.
i converted 2,8 to cm and it says 71cm
So you are just too stupid to use calculators properly. I just used 3 different ones and got these results.
didnt you just say you have a 28"
You are confusing me for the other guy who was talking about desks. I have never mentioned anything about desks.
You refuse to use tools that give you a better experience. What do you gain from refusing to use DLSS
If you're happy with 60-75fps, go with 75. Don't ever upgrade unless you're unhappy with it, same with resolution. Once you upgrade it's hard to go back. What's the point in burdening yourself with higher standards and expectations that will leave you disappointed whenever you don't quite reach them?
144 is a meme, the sweet spot is 120
120hz is actually superior if you watch anime because it reduces judder.
>reduces judder
why not watch it in sync with the output fps?
Have you ever tried using 24hz? It's cancer. 48hz is also cancer, and with 72hz you cause judder on 30fps and 60fps videos (basically everything online). 120hz is the first refresh rate that covers all common situations.
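The divisibility argument above is easy to check; a quick illustrative sketch of which common content framerates map to a whole number of refresh cycles:

```python
# Which common content framerates map to a whole number of refresh cycles?
# A non-integer ratio forces uneven frame holds, i.e. judder.
CONTENT_FPS = [24, 30, 60]

def judder_free(refresh_hz, fps_list=CONTENT_FPS):
    """Framerates that divide evenly into the given refresh rate."""
    return [fps for fps in fps_list if refresh_hz % fps == 0]

for hz in (60, 72, 75, 120, 144):
    print(hz, judder_free(hz))
# 120 is the lowest rate here that covers 24, 30 and 60 evenly.
```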
Are you moronic or something?
You use 24hz for the display that runs the footage, in fullscreen, scaled with madvr or something.
And browse on your main if thats what you do. Why the frick would the 24hz on the anime affect your other experiences?
Also anime is 30 so use 30hz, movies use 24
>Not using BFI
ngmi
>low brightness and flicker
yeah nice
120hz for me. Otherwise I get judder like crazy (double/triple/quadruple images in panning shots) in anime and other 30fps TV series, so I just set up mpc to auto-switch to the correct hz.
judder is a non issue with madvr smooth motion
How much are you looking to spend? I bought this G3223Q for $700 CAD ($511 USD) when it was on sale.
Hey big dog what's the source on this? Couldn't find anything.
im not sure about AMD but through Nvidia's control panel i can setup custom refresh rates.
my 1440p 144hz is set to 120hz. I can set it to 75, 90, 105, 130, anything i want and games show the refresh rate options my monitor has.
the site Rtings has a ton of reviews if you want a good one.
at high fps and higher resolution your gpu wont hold up for new stuff. but you should be fine if you really are only playing older games. you'll feel it if you need to upgrade or not.
1440p is the dumbest gorilla moron resolution around. I respect the guys staying on 1080p more, at least they're just lazy.
40"+ 4k is the first real monitor tech advancement in 20 fricking years. 20 years ago I had a 20" CRT on my desk, 19 years ago I had a 24" LCD (smaller than the 20"), and the slight boost to 28" was never anything to write home about, you gained a fricking inch of height. 43 inches? That's almost twice as tall. It's twice as wide. I can fit 4x more shit on my screen than I could at 1080p, multi monitor setups now no longer make any sense because I can just tile windows.
>40"+ 4k
Yeah, you are definitely mentally handicapped. Imagine using a fricking TV as your monitor.
>multi monitor setups now no longer make any sense because I can just tile windows.
You are absolutely braindead. Imagine having to actually run every game in some weird ass small window because you have other stuff opened as well and it's all on the same monitor. Sounds like an absolute nightmare.
Not that anon but there's 40" monitors. 4K at that resolution is the same pixel density as 1080p at 24".
If you don't care about having low pixel density like that, then it's the best option for real work.
Actually I meant it's the same pixel density as 1440p at 27". For 1080p it'd be more like 21".
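Those density figures can be recomputed directly: PPI is just diagonal pixel count over diagonal inches. A small sketch using the sizes quoted in the thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The comparison above, recomputed:
print(round(ppi(3840, 2160, 40)))   # 40" 4k      -> ~110
print(round(ppi(2560, 1440, 27)))   # 27" 1440p   -> ~109
print(round(ppi(1920, 1080, 24)))   # 24" 1080p   -> ~92
```

So the self-correction is right: 40" 4k matches 27" 1440p almost exactly, while 24" 1080p is noticeably coarser.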
It's not about pixel density. It's about having to sit so far away to be able to actually see HUD elements properly without having to physically move your head. If you sit at a desk, the amount of travel time it would take for your eyes to move up to the top corners to watch something like minimaps in games for example, or even to just move your eyes up towards things like minimize/window/close would be excruciating.
At that point you might as well be sitting in a living room watching a TV.
That's why I said "for real work". Games are better on 32" monitors because you can see the UI easily.
Using low PPI in 2024 is moronic. Makes everything from thumbnails to video game textures look worse. Also makes solid colors look less solid, especially on old pixel based games.
Why are you so eager to talk about things you have no concept of? Moving my eyes from screen center to where the menu bar would be at 1080p is no easier than moving it up to glance at the top of a 4k screen. I typically run side by side 1920x2160 windows for general browsing, and switch to full screen 4k for things that benefit from it, like high resolution porn.
You're acting like I'm anywhere close enough to have to move my head. Do you have to move your head to go between your two side by side monitors? Or do you just glance, moron.
>PPI
I'm talking about PPD, you homosexual. You know, the only actual judge of screen quality, the one nobody uses because they're Black folk. If you don't factor in your distance to the display, PPI is worthless. I could have a 1000 PPI display, but if it's 3 inches across and I'm viewing it at arms length, it isn't worthwhile.
Everything is designed for 20/20 vision. Scaling everything down to make it tiny is moronic, I play my old games scaled up to take advantage of the wide FOV.
But hey, you've got to justify all that money you spent on a glorified 1440p monitor, right? Must be sad, the only time to consider a high PPI display is if you have already maxed out your visual field, and that doesn't happen until you have an 80"+ monitor.
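The PPD point can be made concrete with a pixels-per-degree estimate. This is a rough sketch; the 60 PPD figure equated with 20/20 acuity is a rule of thumb, and the small-angle geometry is an approximation:

```python
import math

def ppd(ppi, distance_in):
    """Approximate pixels per degree of visual angle at a viewing distance.
    One degree at distance d spans about 2*d*tan(0.5 deg) inches."""
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

# A ~110 PPI 40" 4k panel viewed from ~31 inches lands right around the
# 60 PPD figure commonly equated with 20/20 acuity. Double the distance
# and the same panel doubles its angular density.
print(round(ppd(110, 31)))
```

This is why PPI alone says nothing without a viewing distance attached.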
Name the 5 last games you have played, and what resolution you played them at.
Virtua Cop 2, 2880x2160. Silent Hill The Arcade, 3840x2160. The Lost World, 2880x2160, LA Machineguns, ditto. Confidential mission, ditto.
I imagine this isn't really the kind of list you're looking for, so the last modern game I played was Void Stranger, which I played in a window at like 1200x700 or some shit.
Says the guy who still can't tell the difference between D and I. Must be rough.
Everything is scaled down. Your image shows it, hideous artifacting from non-integer scaling, and the background shows you run your hardware at 150% scaling or some shit, throwing away most of your resolution. You are literally browsing at 1440p. This is what text should look like at 4k.
>2880x2160
>3840x2160
>2880x2160
>1200x700
You are a weird person if you are constantly switching the resolution and window sizes of your games, and you feel perfectly okay with that. That would be so annoying. How do you even make sure that they are always perfectly in the center as well, or do you not care?
What the frick are you talking about? Nothing is "scaled down" or "made tiny"
Everything is made significantly sharper, you can even read text using image hover.
Also my browser window only takes up half of my screen. You can't even do this on a 1080p screen, and probably only 1440p if you make your browser take up the entire screen like a mongloid.
>Also my browser window only takes up half of my screen. You can't even do this on a 1080p screen, and probably only 1440p if you make your browser take up the entire screen like a mongloid.
Of course you can, websites are designed with 1280 or less width in mind.
A 1440p display can do 3 browsers side by side depending on the site, but 2 is ideal.
I said "image hover". That means Ganker.
Here's a thumbnail without image hover. Try doing this on 1440p without it being unreadable.
>60ppi is fine
this monkey is literally blind if he doesnt see the pixelation around fonts and everything else, 80ppi was dogshit on a 27" 1080p display i had almost 2 decades ago
>Imagine having to actually run every game in some weird ass small window
I run all my games at 4k with high FOV. Well, older titles without widescreen support I run in a 35" 4:3 window.
Imagine thinking a screen size can be "too big". Meanwhile you'll happily use side by side displays that take up more fricking desk space and offer less. Only once you get past 80" do displays start getting too big for desks, which means 40"+ isn't even close to the limit.
>low pixel density
60 PPD has been good for the past 40 fricking years. It's literally 20/20 vision. Sure, I could use a smaller screen for a higher PPD, requiring fricking scaling which turns 4k into a glorified 1440p with slightly sharper text. But no content is designed for high PPD. Only moronic macgays who can't use cleartype like scaling, it's wasting pixels you paid for.
>Imagine thinking a screen size can be "too big"
They absolutely can. See
are you stupid or something? 40" 4k = ~110 ppi not 60
I'm just waiting on 4k OLED with 120Hz+ with BFI under 30".
I don't care if I have to pay $1000+, I just wish they would make smaller fricking displays, I don't want a fricking 32" display.
Guys what do you think about the PG32UCDM apart from the fact that it has the Asus ROG price tax like apple products
Too big, otherwise it would be perfect. Also, it has BFI which is good, you definitely want that, as it allows you to have the motion clarity of high fps without actually needing the game to be in high fps, it's perfect for emulation, games that are locked to 60fps, or games that you can't run at high framerate. Strobing (which is what BFI is basically) is what allowed CRTs to have perfectly clear motion.
From my experience anything above 60Hz is a meme. Buy a 4k TV, or at least a 4k monitor, especially since you don't play FPS. You can kind of see when the framerate goes above 60 fps, but when I look at my old 1080p monitor it's a huge difference from 4k. You can of course say that 1440 is not 1080p, but it is not 4k either.
>From my experience anything above 60Hz is a meme
Get your eyes checked.
I'm convinced these people never changed the setting in Windows to change the refresh rate. They just plugged it in and said there's no difference
cope
You want at least like 120Hz so you can have a low lag display with VRR and BFI.
60Hz fixed refresh rate is fricking garbage for PC gaming because anything but 30fps or 60fps is going to have a bunch of judder, and V-Sync adds a shit ton of lag for fixed refresh rate displays (and the alternative is screen tearing). Just at least get a 75Hz+ VRR monitor, that's the bare fricking minimum for decent PC gaming, even if you never run a game above 60fps
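The fixed-refresh judder being described can be simulated. A simplified sketch of V-Sync frame pacing (it assumes frames finish rendering at perfectly even intervals, which real games don't):

```python
import math

# On a fixed-refresh display with V-Sync, every frame is held on screen for a
# whole number of refresh cycles. Framerates that don't divide the refresh
# rate produce alternating hold times (judder); VRR avoids this by refreshing
# whenever a frame is ready instead.
def hold_cycles(refresh_hz, fps, frames=8):
    """How many refresh cycles each of the first `frames` frames is shown for."""
    # refresh cycle index at which each frame first appears on screen
    shown = [math.ceil(i * refresh_hz / fps - 1e-9) for i in range(frames + 1)]
    return [b - a for a, b in zip(shown, shown[1:])]

print(hold_cycles(60, 60))  # steady: one cycle per frame
print(hold_cycles(60, 45))  # alternating 2/1 holds, i.e. visible judder
print(hold_cycles(60, 30))  # steady: two cycles per frame
```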
If youre playing a single player game the lag from vsync is negligible
No, it's actually pretty substantial, it's usually a good 50ms or so, about as much difference as Game Mode makes on most TVs. If you're used to laggy console garbage, maybe it's acceptable, but low lag gaming is a major positive for PC. You're just a turd world poorgay coping with your garbage display
You're a mentally ill frametroony
CRTs back in the day could support different refresh rates, had perfect motion clarity, and non-existant input.
A 60Hz fixed refresh rate display today can't output at arbitrary refresh rates (which creates major issues for emulation and fluctuating framerates), has horrible motion blur (with no BFI to alleviate it), and is extremely laggy due to fixed refresh-rate V-Sync.
Keep coping with your shit display you brown third world mutt
>support different refresh rates,
Not an achievement
>had perfect motion clarity,
Le buzzwords
>and non-existant input.
ESL drivel
It's very strange this tribalism cope from people who cheap out and buy shit tech. It makes a lot more sense though when you realize most posters on here are poor browns who can only afford cheap outdated garbage or underage morons who have to take whatever mommy buys them.
You will never be woman. You will never be white.
>t. diversity hire glowBlack person troony who makes 5k a week to shill gay rights on the internet
No one wants your tiny overpriced monitor. Frick off.
Your experience is shit, because there absolutely is a difference.
After 100+ frames, 60 frames looks janky as frick.
After 4k 1080p looks like a mobile game
Never said it didn't. Both things can be true at the same time.
That's why the best thing is 1440p 120+ Hz.
1440p is about 1.78x the pixel count of 1080p and the performance demand isn't THAT high.
4k you start getting diminishing returns compared to the required performance increase to run four times the pixel count of 1080p (2.25x that of 1440p).
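Recomputing the ratios behind these performance claims:

```python
# Pixel-count ratios between the common resolutions discussed here.
RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}

def pixels(name):
    w, h = RES[name]
    return w * h

print(pixels("1440p") / pixels("1080p"))  # ~1.78x
print(pixels("4k") / pixels("1080p"))     # exactly 4x
print(pixels("4k") / pixels("1440p"))     # 2.25x
```

GPU load doesn't scale perfectly linearly with pixel count, but it's a decent first-order estimate of the jump in rendering cost.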
Performance troony
>Ganker doesn't understand gradeschool geometry
Embarrassing tbh
>pc peasants still playing on 1080p while consoles moved on to 4k
Get a 4k 60hz tv. 1440p is a meme, 75hz or 120hz is a meme
Find actual stats on the monitor and how fast it redraws things. Shitty monitors that are nominally 144hz often have ghosting (trailing shadows) at that refresh rate.
I recently upgraded from 1080p to 1440p and it's pretty nice. Years ago I remember Ganker telling me 1440p was a meme resolution and to just wait for 4k, but they were wrong since it's a noticeable difference. Now I'm curious about 4k but it's too demanding.
It just works
waste of money
Most emulators can increase the refresh rate beyond 60hz
OLEDs have superior flickering and superior burn-in, I sure hope you're not a fascist and that you can learn to like defects both in technology and people like me.
It's 2024 and all monitors are still dogshit in some capacity. IPS has 0 contrast and is therefore of course garbage, VA has serious black smearing issues and is slow, TN is well TN, and OLED forces countermeasures like pixel refresh and other crap to try and prevent burn in so that's trash as well. No matter which modern monitor you go with, you are compromising heavily in some area, some more than others. It has been like this for well over a decade now. Grim
Retvrn to tradition, white man
>Grim
MicroLED will save us. It is truly the future, we just need it to become mainstream. You get colors that are 99.99% like OLED, much better brightness, and no burn-in.
>MicroLED
This is a very, very long way out.
miniLED TVs and monitors are already somewhat widely available, and apparently the factories and manufacturing processes needed are very similar, so the people behind the microLED industry are speculating that we could probably have microLED within the next 10 years.
https://www.microled-info.com/microled-industry-association-publishes-its-first-microled-industry-roadmap
>so the people behind the microLED industry are speculating that we could probably have microLED within the next 10 years.
Yeah, like I said, a very long way out.
OLED just became mainstream like last year or even just a couple of months ago. Expect OLED to stay for the next 5 years MINIMUM while it becomes the main and most popular panel type. The great thing for monitor manufacturers is of course the risk of burn-in, because you'll have to buy another monitor again. They won't be so eager to leave this technology behind just yet.
QDLED
A lot of QDLED TVs are actually worse than OLED in many cases when it comes to burn-in.
>75hz
why would you get a monitor with a refresh rate that doesn't sync up with the framerates of any type of content?
>vidya: 30, 60, 120
>tv: 24
120hz is the correct refresh rate for the most types of content
I cannot understand why anyone would not want to have a 50 inch or bigger display for gaming in this day and age. It simply is good. If you are still using 32 inch or smaller bullshit. Stop everything you are doing because you are just fricking wrong.
If I had a 50 inch TV on my desk, I would have to physically move my head to look at minimaps in the top corners. What the frick would the point of that be? It would only make sense if I sat several feet away from the TV.
A desk? What the frick? Go buy yourself a lazyboy two. Its not 1994
"Help me back to my computer desk son, I fell on my zimmer frame, come and see my 32 inch 144mhz tiny little screen"
Are you writing these posts from your lazyboy with your controller?
Yes I'm on my phone
I'm actually dictating them using a python script on an old Xbox Kinect plugged into a laptop. To post I say 'n*gger n*igger n*gger"
Back in my day we played asteroids in 140i 1200fps and it was the golden age
filthy casuals you dont play any competitive games
!Look at my computer desk son! It has storage areas for the paper from my printer and here is my special mousemat!! Here is where I sit looking at my tiny little monitor"
If you're going to upgrade to 4K, aim for an ultrawide 1440p with HDR+ instead. It's slightly less taxing than 4K while still looking great. A lot of emulated games get ultrawide mods too.
>t's slightly less taxing than 4K
MUCH less taxing.
>hdr
This is the thing I'm actually excited about, especially the higher the brightness above "sRGB white" you can get. I personally find good content that takes advantage of it more dazzling than higher resolution. And the best part is that, presumably, it doesn't actually increase the performance overhead of games that much, if at all. After all, games already render in a scene-referred space, so it's just a matter of mapping the colors to HDR space, and they already have to do that for SDR space.
But so much about how HDR is marketed is bullshit. An overwhelming number of displays on the market implement "HDR" as a glorified brightness level which is no different than manually turning up the brightness on an SDR display. And even the ones that can actually deliver the contrast required are either OLEDs that have burn-in and don't have much in the way of full screen max brightness, or are LCD panels where even a "staggeringly high" number of dimming zones is only a paltry 1000 or so, when you really need 30-80k before it starts getting acceptable.
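For reference, the "mapping colors to HDR space" step usually means encoding absolute luminance with the SMPTE ST 2084 PQ curve (the transfer function HDR10 uses). A sketch of the inverse EOTF with the constants from the spec:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 signal.
# Constants are taken from the ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance (0..10000 nits) to a PQ code value in 0..1."""
    y = max(0.0, min(nits / 10000.0, 1.0)) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# 100 nits ("SDR white") lands at roughly half the signal range, which is why
# PQ can spend the upper half of its code values on highlights up to 10000 nits.
print(round(pq_encode(100), 3))
```

The perceptual shape of this curve is exactly why the encoding itself adds negligible rendering cost: games already compute a scene-referred luminance, and this is just the final mapping.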
>glorified brightness level which is no different than manually turning up the brightness on an SDR display
Nah, it's not that bad.
You just have to be aware it's going to be entry level. And even if it is considered poor HDR, it still looks way better than SDR.
I'm replaying Cyberpunk right now for DLC, and boy does it make a difference.
You might overclock the 75hz monitor to get some extra refresh rate out of it. My 144hz monitor is overclocked to 169hz
Been using an OLED monitor for over a year now and it's been awesome. Sorry you're poor
see u in 3 years
So? I'll just buy a new one if it gets damaged. I don't display static images to deliberately damage it like this
>Been using an OLED monitor for over a year now and it's been awesome. Sorry you're poor
>old OLED TV from 8 years ago with no antiburn in measures
>Been using an OLED monitor for over a year now and it's been awesome. Sorry you're poor
Why do you have troony vids saved on your PC?
i like how the camera guy hides his face
"Look at my computer desk son! I store my commemorative plates from the Great War of Northern Aggression in it to. Back in my day we sat in front of these tiny little screen we called 'monitors' and rubbed mouse balls"
You're a frame troony that counts frames instead of playing games and counting pixels like me
>getting mad because other people don't use the same size of a monitor, with the same resolution and the same refresh rate as you personally
lamo
Is it worth investing in a 4K HDR television (TV) for gaming?
Investing in a 4K HDR TV for gaming can greatly enhance your gaming experience, especially if you have a gaming console that supports 4K HDR. The combination of increased resolution, vibrant colors, and improved contrast can make games more immersive and visually stunning. However, it's important to consider your gaming preferences and budget. If you primarily play older games or aren't particularly concerned about graphical fidelity, a regular 4K TV may be sufficient. On the other hand, if you enjoy the latest AAA titles and want the best possible visuals, a 4K HDR TV can provide a noticeable upgrade.
Dolby Vision is an enhanced form of HDR that can use 12-bit color, resulting in about 68 billion colors that create a dramatically richer, true-to-life image.
>muh computer desk with little monitor
I've got a dolby vision certified display and I think the only game i've played that supports it is need for speed heat.
Auto Low Latency Mode: You can also enable the ALLM (Auto Low Latency Mode) option in the Settings > System > Device Settings menu. If this function is enabled, the TV automatically switches the picture mode to Game when an ALLM mode signal is received from the current HDMI source, and switches back to the previous picture mode when the ALLM signal is lost, so you won't need to change the settings manually. If the ALLM option is enabled, the picture mode cannot be changed as long as an ALLM signal is received from the current HDMI source. For the ALLM feature to be functional, the related HDMI source setting should be set to Enhanced. You can change this setting from the Sources > Source Settings menu or from the Settings > System > Sources menu.
Why are you copypasting from websites and manuals? Are you a bot?
Still using my 64i monochrome amber monitor here.
>mindless consoomers pretending to know about displays again
Get at least 120+, although your 2060 will probably not be enough to push these frames.
I got pic a few months ago and it's been working great. If you want something more get a monitor with good HDR.
>Frametroony thinks GPUs dictate frames
>moron makes a moronic post
>le r word
Reddit
>no argument
Why don't you just frick off, moron?
Let's see a 2060 push 100+ frames in Cyberpunk at 1440p.
GPU resolution CPU frames
Don't talk about things you don't know, moron.
You first troony
I do know what I'm talking about, unlike you, moron.
You're absolutely fricking insane if you think a GPU doesn't matter for framerate. Or worse, a consoleBlack person pretending to be a PC user.
>You're absolutely fricking insane if you think a GPU doesn't matter for framerate.
>I was playing UT99 and Q3 at 100 FPS + 100hz on my Samsung SyncMaster
Frametrannies. Not even once.
Keep shitposting, consoleBlack person.
>2060
Now we know it was just cope all along.
is ultrawide worth it for productivity vs having several monitors?
For productivity? Maybe. But it blows for video games when the devs don't support it, because either you won't be able to use the resolution at all and you have to select a lower one, or there will be black bars.
But do black bars even matter when it will just turn into a 16:9 monitor?
That depends entirely on the person. I'd personally be annoyed with it every time I saw that my monitor wasn't being fully utilized, but other people might not give any fricks.
I have a 165hz monitor but other than using more resources is there a reason to use 165 over 144hz? Hell even 120?
130+ feels great. Less than that feels abhorrent.
more is smoother
144 all the way
play a game at 144 for a bit and go back to 60 and you'll fall off your gamer chair
1080p 144hz is enough. Stop reading so many ads.
1080p => 1440p is a 33% ppi increase (at the same screen size) for ~78% more pixels to render.
But they're literally acting in that pic you media illiterate.
Anyone here with a 120Hz but using it on 60Hz mode instead? It just feels more natural, the way a computer ought to work.
75hz is pointless and anything at 144 would basically cost the same
27" monitors are a meme. Get the one 24.5" 1440p Display.
Honestly I'd save for 4k, I just got a 1440p 27'' and the fact I have to use 125% scaling is pissing on my cornflakes, especially on linux
>and the fact I have to use 125% is pissing on my cornflakes
Why do you have to use 125%?
150% is borderline unusable
100% I have to sit a little bit too close to the screen to read anything
125% is just about right, just that it's this odd thing that some software really has a problem with
You want 144hz if anything just for normal browsing
I have a 1440p 165 hz monitor which are very affordable currently and games don't require a 4090 to run.
Is there a reason to buy a super high refresh rate monitor if my PC isn't good enough to hit those framerates?
>Is there a reason to buy a super high refresh rate monitor
No. 4k or bust.
Any framerate above 60 will still be displayed and feel smoother. If you have a 120/144hz monitor, but your hardware can only run the game at let's say 80-90 FPS, then it will still look and feel smoother than 60fps on a 60hz monitor. Now whether this is worth it is entirely up to you. You can sometimes get pretty good 120/144hz monitors for like $150 during sales.
>run the game at let's say 80-90 FPS, then it will still look and feel smoother than 60 FPS+hz
What if I'm pretty much stuck at 60-70?
>What if I'm pretty much stuck at 60-70?
Then there's no point, sorry.
I chose a 1440p monitor because of the advantages such as low response times, low input lag and so on. A 4K monitor with the same specs was considerably more expensive, as well as naturally making my computer unable to achieve the higher frame rates that are essential for low input lag.
I have an RTX 3090, so I thought I had made the wrong decision, but playing some games made in Unreal Engine 5 I noticed that my GPU isn't even enough to maintain 60fps at 1440p, let alone 120fps or more. So today I don't even think about 4K anymore, even though it's objectively better, obviously; the more resolution the better.
I have a 144 hz monitor.
You don't need one. The difference is minimal.
If your current monitor breaks or you need a new monitor for an ACTUAL reason, sure, get a 144 hz monitor but if you currently have a working monitor, there is no reason to "upgrade".
>The difference is minimal.
Get your eyes checked. The difference between 60hz and 144hz is like night and day.
higher refresh rate is better
Trans btw