It's funny how, after years of using 24" 1080p monitors, everyone somehow thinks that a 32" 4K panel, which at 100% scaling is essentially no different in pixel density from a 16" 1080p monitor, is going to be fine. I went down to 43" for my 4k monitor, which is a little smaller than ideal, but I just sit a few inches closer. For 32", you've got to sit a lot closer for a usable image.
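To put rough numbers on the density point, here's a quick Python sketch (the ppi helper is just ad-hoc geometry, diagonal pixels divided by diagonal inches, nothing from any spec sheet):

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch along the diagonal
    return math.sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

print(round(ppi(1920, 1080, 24), 1))  # ~91.8  -> 24" 1080p
print(round(ppi(2560, 1440, 32), 1))  # ~91.8  -> 32" 1440p, same density
print(round(ppi(1920, 1080, 16), 1))  # ~137.7 -> 16" 1080p laptop panel
print(round(ppi(3840, 2160, 32), 1))  # ~137.7 -> 32" 4K, hence the tiny UI at 100% scaling

So at 100% scaling a 32" 4K panel really does have the same pixel density as a 16" 1080p screen, which is the complaint above.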
>dumb question but is there any way to prevent dead pixels?
Pixels rarely die during the life of a given monitor. Almost all monitors that have dead pixels were shipped with them.
So make sure you have checked your monitor for them as soon as you get it; after that, no additional actions are needed.
>also no harm in leaving it permanently on (idle/sleep mode for the night)?
I would assume no harm. The idle mode turns the relevant parts of the screen off, so it's not actually doing anything. Aside from scenarios like the wiring in your house being shit or a bolt of lightning hitting it, there should be no real threat in keeping them on and idle.
You might want to avoid keeping it permanently displaying something, though, because LCDs are still subject to getting burn-in, even if nowhere near the degree OLEDs are.
>as soon as you get it
yeah I did that with my new monitor and no deads. my nearly 10 year old monitor has one, but I'm very certain it didn't ship with it. it could've and I just didn't pay attention until I looked for it, but one dead pixel is nothing
>because LCDs are still subject to getting burn-in, even if nowhere near the degree OLEDs are.
oh okay I didn't know that, the worst thing that I have is the tab bar at the top of my browser and I assume that can't hurt much assuming it goes off every day?
thanks overall 🙂
>but I'm very certain it didn't ship with it
It's still possible, just very uncommon. You got unlucky, anon.
>the worst thing that I have is the tab bar at the top of my browser and I assume that can't hurt much assuming it goes off every day?
Yes, you'll be fine. It normally takes some deliberate misuse to get visible burn-in on an LCD so as long as you are using it in a reasonable fashion (i.e. don't leave it on displaying a static bright image for weeks) you won't really need to do anything to avoid it.
>Why dont you
Why don't you shut the fuck up and fuck off then?
You are objectively fucking wrong, and I bet that's not the first time you are saying stupid shit ITT.
I'm not here to educate your stupid ass on things that you should at least fucking google before saying.
What next, do I have to prove to you that you'll die if you stop drinking water?
You have to be at least 18 to post on this website, you illiterate chimpanzee.
I've definitely had burn-in on LCDs, just saying. It doesn't happen much any more, but the picture quality of my TV from 15 years ago was fucking destroyed by staying on one of those TV guide channels when they were still around. I remember specifically playing MGS2 HD and still seeing the guide lines and the cable company logo in the image. Maybe it doesn't technically qualify as "burn-in" because that specific ghost image will eventually fade, but in practice it was just old retention being replaced with new retention, even when the content wasn't static. There were always weird lines and blurry overlays in it; the TV was garbage. I won't get an OLED until burn-in is as much of a non-issue there as it is on a modern LCD.
I just joined this thread, if you actually read my post I'm trying to make a point that it DID exist, but doesn't much any more, which is why I'm not getting OLED until it's fixed. There will be a day when OLED doesn't burn in.
I'm not trying to shit on LCD, I'm actually defending it.
Don't reply to that retard.
He's too dumb to type >can LCDs get burn-in
in google and read the result.
He's an example of an utterly computer-illiterate retard who somehow managed to remain this ignorant and stupid in the day and age when even 80-yo grannies have enough technical know-how to do a google search.
This monkey thinks LCDs get burn in.
This monkey doesn't know the difference between plasma and LCD.
This monkey thinks retention is burn in. Even retention is difficult to get on an LCD. >BUUUUUUUUUUUT MUH EXTREME CASES LIKE 300 NITS OF BRIGHTNESS FOR 24HOURS A DAY for 365 DAYS A YEAR FOR 10 YEARS WILL SURELY BURN IN HAAAAH GOTTEM!!!!!!!!!!!!!!!!!!
This dude is the tranny that says 4K gives higher FOV in games
All screens will burn in if you use them long enough with static content (particularly if blasting brightness). The real difference is whether that happens over a timescale where it matters.
OLED burn in won't be fixed before microLED becomes viable I reckon and by that time lol OLED.
32 inch.
1440p.
144hz (or 165hz).
For me it's pure goldilocks zone.
Side by side windows are perfectly large for doing work tasks. Full screen feels great for immersive gaming. Great size for films.
The bump to 4k would be awesome, but at that price I'd rather put the cost difference into something that would round out the setup, like a better sound system or some ergonomics for comfier desktop time.
1440p is gay and retarded and i've never used it or even seen it in real life, and i haven't touched a monitor that has it and it's gay and a poor man's 4k but not 4k
4k is the MINIMUM monitor resolution, 5-8k is best
1440p made some sense over a decade ago when you wanted to spend a little more for a monitor that was a bit bigger. It's gay and retarded now you can get 4k for sub 300 bucks, though.
27" monitor is big enough. Going much bigger would be pretty awkward, I feel like I would need to sit further away from the screen. And 1440p is common 27" resolution
What are you running that you can't play at 4K/144 with a 4090/7800x3d?
I don't really understand this. TPU shows that it gets 125fps 1% LOW and 170fps AVG with that setup. That seems quite capable and worthy of saying "it runs 4k 144hz". It literally does it.
Bros what do you think the best possible GPU for 1080p is ? I wanna splurge on something for muh graphix and muh goytracing but I don't feel like upgrading all my monitors to 1440p
60FPS is fine for any video game. 120 if you're playing an online twitch shooter. A smooth frame rate and a high level of detail are better than a high framerate and ugly graphics.
I use an ultrawide 2K resolution monitor, it does up to 180 FPS, some games I play 180, others 144, others 100, I don't mind 60FPS on single player games. Frame generation is becoming a thing for AMD as well, which can double framerate pretty effortlessly. What I'm saying is I can be more flexible with a 2K monitor setup than a 4K. I've got a $500 graphics card that can run everything fine, if I were running 4K I'd need a $1000-1600 card for comparable performance.
>If
I have no clue what hz means
HohenZollern, the rightful ruling dynasty of Germany
144hz. 4K is still a meme and only really good for certain edge cases.
Heinrich Ziegler, competed in the fencing Olympics back in 1912.
>which is better
A 1080p panasonic plasma running at 60hz.
The classic Megadrive gave Herzog Zwei.
hz = hertz, named after 19th century German physicist Heinrich Hertz.
a hertz measures cycles per second, i.e. frequency. the time one full cycle of the wave takes (the period) is just 1 divided by the frequency. when it's used to describe displays the signal is a digital square wave of refresh cycles rather than an analog sine wave, but the unit means the same thing either way: cycles per second
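If the numbers help, the frequency-to-period relationship is just 1/f; a tiny Python sketch (nothing monitor-specific assumed):

for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> one refresh every {1000 / hz:.2f} ms")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms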
beans
It's basically how fast the monitor refreshes the image. Think of it as a limiter on your frames per second. If your PC can run a game at 120 FPS and your monitor is only capable of 60hz then it will only ever display 60fps. You're basically wasting all those extra frames your hardware can do because your monitor can't even display them. But if you have a 144hz monitor then you can see all of the 120fps your rig can do
It is still better to have a high framerate though, because even on 60hz each refresh shows the latest rendered image. The motion is still only 60fps, but rendering at 120fps means every refresh gets a more recent frame than native 60fps would, which gives a slight edge.
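A toy model of that "more recent frame" point, assuming perfectly even frame times and no sync queueing (real games won't match this exactly):

# on a 60 Hz panel, each refresh shows the most recently finished frame;
# the faster you render, the fresher that frame is on average
for fps in (60, 120, 240):
    frame_time_ms = 1000 / fps
    # on average the newest finished frame is about half a frame time old,
    # and a full frame time old at worst
    print(f"rendering at {fps} fps: displayed frame is ~{frame_time_ms / 2:.1f} ms old on average")
# 60 fps -> ~8.3 ms, 120 fps -> ~4.2 ms, 240 fps -> ~2.1 ms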
fps
1440p 144hz
even for slow-paced stuff?
Why the fuck are you asking if 144hz is important if you're going to play slow paced stuff?
ANSWER MY QUESTIONS
He answered you retard. If it's slow paced it doesn't fucking matter as much
abso fucking lutely
the duality
Slow paced stuff to me usually means mouse-driven UIs and a lot of screen scrolling, which 144hz absolutely benefits.
>What is variable framerate
General Use: 4k. Playing a game: 144hz.
1440p isn't small m8
Correct answer. The best balance between resolution and framerate. You always want performance first and resolution second, so long as the graphics can support High/Ultra at the framerate you desire.
Someone on Ganker told me this years ago and it's absolutely the truth.
I bought a 1440p/144hz 32in LG gaming monitor, and then a few years later, bought a 4K TV with a gaming mode (it can do 120Hz) with Dolby Vision for some Xbox games and HDR for pretty much everything else.
These two are side by side and, honestly, the monitor looks and runs better. And 32in is honestly as big as you need. I have to push it away from me so it's not bugging my eyes out. The 4K TV is off to the side and pushed even further away and it's only 49in
It's pretty simple. If you're sitting a few feet away from the screen and your game has details you want to see like reading text, 1440p/144hz all the way. If you're sitting several or more feet away from the screen then 4k regular TV is the way to go (make sure it can't burn in though!).
this is the golden ratio right now
Aren't you bound to get shit scaling on all kinds of stuff?
t. clueless
What scaling? You are supposed to play games at native resolution for the best clarity of image
Yeah, that's what I mean. Now that I think about it, I guess it's really good for most retro resolutions, but whenever you use the monitor to watch stuff, which is often going to be at 1080p, or when you have to drop the resolution down to that in this era of unoptimised slop, it won't be so nice.
If budget is even a remote concern, you should just ignore 4K. Resolution still ruins performance more than anything else.
144hz, 1080p and 4k basically look the same a few feet away
There are 4K 144 Hz monitors out there you know... You could even buy a 4K 240 Hz monitor if you'd like:
https://www.samsung.com/us/computing/monitors/gaming/32-odyssey-neo-g8-4k-uhd-240hz-1ms-curved-gaming-monitor-ls32bg852nnxgo/
https://rog.asus.com/kh/articles/monitors/the-rog-swift-oled-pg32ucdm-hits-the-sweet-spot-of-size-and-resolution/
read the fucking post
It's literally under $1000 right now though:
https://www.amazon.com/SAMSUNG-Odyssey-FreeSync-Ultrawide-DisplayPort/dp/B09ZH3WM47
That's not a lot for a monitor these days, is it? I recall seeing 1440p monitors not that long ago that were way more expensive than that.
I'm talking about the computer upgrades. you aren't driving 4k 144 with anything less than a 7900xtx or a 4090
Depends on the game in question and the settings used I guess. 1080p60 can even be taxing for those cards in some "modern" games at maximum settings, next year it will probably be even worse. Maybe we should be buying up old 720p monitors.
All of that bloated AAA stuff is stupid though.
>Samsung
Don't. Most of their monitors get dead pixels or some other bullshit all the time, save yourself the headaches.
>curved
Can someone explain what's the point?
if you're willing to spend like $800 more than 1440/144 or 4k/60 after GPU and CPU upgrades to actually drive that many pixels/second
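For reference, the raw pixel throughput being compared, as a back-of-the-envelope sketch (resolution times refresh rate only, ignoring everything a real GPU workload involves):

modes = {
    "1440p/144": (2560, 1440, 144),
    "4K/60": (3840, 2160, 60),
    "4K/144": (3840, 2160, 144),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} Mpix/s")
# 1440p/144 ~531, 4K/60 ~498, 4K/144 ~1194 -- roughly 2.2-2.4x either of the cheaper targets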
I have the Neo G8. It's good.
Just buy an oled panel if you niggah are gonna start going crazy
4K 60fps is cheaper to buy and assemble a computer to run at.
1440p 144hz is the middle ground.
4K 144Hz is real expensive.
And what happens is you can have it in some games but then only get 90fps or 60 in other games.
Not even a 4090 can reach 4K+144hz
>Not even a 4090 can reach 4K+144hz
My 4090 does just fine.
Do retards like that guy you replied to not know of Adaptive Sync?
Monitor refresh isn't static anymore. What a monkey
You are very cool and smart, anon.
I think you should get a tripcode so everyone can know it is you.
Let me guess, you're a caveman that sees the refresh rate and thinks "144hz? so I must have 144fps at all times"?
I can't be sure you are the same anon because you don't have a tripcode, so I'm not going to respond to this question.
Adaptive Sync is ass
>need to play in Exclusive Fullscreen
>more Motion Blur and Inverse Ghosting
>depending on brand it disables some custom settings like color calibration
Just buy a 240Hz monitor and don't have to deal with this shit.
None of that is true
What? Are you retarded?
Save your shekels and get both. Once I got my LG C2 I wasn't happy with the 144hz 1440p monitor I had bought and the cheaper 4k Samsung tv.
>Once I got my LG C2
Ya got memed.
Looks good on my screen. I don't need to listen to some baldie ramble on about bullshit for an hour to know I'm enjoying it and it BTFOs every other screen I've tried.
>ramble on about bullshit for an hour to know I'm enjoying it and it BTFOs every other screen I've tried.
It doesn't.
Enjoy your lag and shit SDR
holy fuck lmao, i always felt bad that the c2 came out a few months after i got my c1 but now im glad
ya C2 was the best purchase ive ever made
couples nicely with console and pc, even xbox 360 games are improved on the screen. it makes everything look so good. not to mention 4k hdr blurays played on the series x, Holee shit they are nice
ive had mine for almost a year now and so far no issues, except once the channel switch to pc was buggy because the xbox had hdr mode on
Oled is a meme in general. Sure samsung has made the tech better with qd oled but overall it's still full of compromises compared to cheaper traditional full array lcd tvs.
refresh rate will always be superior to resolution even 720p144hz is better than 4k60hz
It doesn't matter what resolution you pick, anything below 75hz looks atrocious. I'd rather have 900p@144hz than 8k@60hz.
>4k 60hz
Ewww.
>1440p 144hz/1080p 144hz
Based.
Based.
you dont deserve money
>you dont deserve money
Okay okay, here's a much cheaper monitor that still is both 4K and at least 144 Hz:
https://www.amazon.com/Acer-XV282K-KVbmiipruzx-Agile-Splendor-FreeSync/dp/B0BGM54LT6
If you can track down the 28-inch model, it should be even cheaper.
you deserve money
Why is 1440p/144hz recommended so much? Is it just people who play FPS games? Is it still better for strategy and RPG?
If you don't have [insert current top tier GPU here] you shouldn't bother with 4K
since i have both i would pick 4k
If you are too poor for a monitor you are definitely too poor for the GPU to exploit all the features.
My problem is that I have the money for literally whatever I want but I don't want to burn it on stuff that's not worth it. I don't want to be a beta tester or pay like 90% more for 10% better experience.
The way that manufacturers have mind broken consumers into thinking 4k is a logical step from 1080p is legit hilarious. Pretty sure the 4090 paypig edition can't even handle some modern games at that resolution. Then seeing >people getting double mind fucked into thinking you need 144hz+ @ 4k is even more comical. If you don't like videogames, don't play them. If you can't have fun without spending $10k on a linus tech tips endorsed setup then again you don't enjoy videogames, you enjoy consuming.
It's fun to take 8K screenshots of games that I'm replaying and turn them into wallpapers.
monitor 2k 144Hz
TV 4k 60Hz
I prefer 4K 60Hz personally, especially for non-gaming
I agree with these. I defaulted to 4k/60 because of normal desktop stuff. I'm definitely not downgrading to 1440p for 144hz, and I'm not going to jump into 4k/144hz until it's far more affordable. You can play video games at 4K/60 with a 6800XT or maybe a 4070. It's actually obtainable and not insane.
In general? 4K. But this is a video game board. Go with a faster refresh rate.
I have a basic 60Hz 4K monitor, but not because of games.
4k
Unless you're a manchild gaymer who uses his computer for literally nothing else
>bothering with monitors when VR exists
Isn't good VR like a $1,000+ investment though?
it costs less than average 4k monitor nowadays
no it definitely doesn't unless we're talking about something seriously lame like Meta headsets
Pico 4 and Quest 3 are unironically good headsets. If you want a top notch device, you go for the Bigscreen Beyond. But top notch 4k screens are more expensive than that.
144hz 1080p
4k
4k is not an option if you're poor, unless you don't mind playing at medium-low settings all the time.
As an owner of a 1440p 180hz monitor however, I'd pick resolution over refresh rate. Competitive gaming is a meme, you don't really need more than 60hz.
if you have a 1440p monitor you don't have to worry about resolution retard. what are you gonna do, play at 1080p? I always turn my 4K monitor to 1440p and all settings to medium so I can max out fps. fps uber alles
Case in point - poorfag with a 4k monitor.
144hz is not a meme. Anything above that is. 120hz is the new 60hz nowadays.
4K/60 is a scam.
120fps is far more immersive than meme special effects. Nvidia likes to jack up the special effects and then pretend capping games at 60 fps is OK so the industry can keep selling shitty 60hz displays.
4K is stupid for gaming. 1440p has most of the benefits (integer scaling with 240p, 480p, and 720p content) along with being a lot easier to drive.
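The integer-scaling claim checks out arithmetically; a quick sketch ("integer" here just means the old vertical resolution divides evenly into the panel's, so each source pixel maps to a clean n x n block):

for target in (1440, 2160):
    for src in (240, 480, 720, 1080):
        factor = target / src
        kind = "integer" if factor.is_integer() else "fractional"
        print(f"{src}p -> {target}p: x{factor:g} ({kind})")
# 1440p: 6x / 3x / 2x for 240p / 480p / 720p, but 1080p lands on 1.33x
# 2160p: 9x / 3x / 2x for 240p / 720p / 1080p, but 480p lands on an awkward 4.5x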
>Refresh rate is a MEME through and through. Only true answer is Higher Resolution
I've got a friend that plays Siege on a fucking Panasonic TV from the early 2000s on a console with a busted ass, cum covered controller. And still manages to rank Diamond League year after year
What's better, playing games scaled to hd on a 2k monitor, or using a 27 inch HD monitor for everyday use?
bumping this
>hd on a 2k monitor
>scale 1080p to 1080p
I'm just going to assume you mean "Full HD" because "HD" is 720p
It's a
>just b urself
kind of a question, anon.
1080p will objectively look better on a 1080p monitor.
Chances are, you won't really notice the difference between 1080p native and 1080p stretched to 1440p unless you are looking for it but it'll be there.
1440p is a more "future-proof" version - you'll be able to play games at higher resolution at a later point if you upgrade your hardware.
You'll also be able to play older games on 1440p right now, and it'll be fine. So overall 1440p is a better investment than 1080p but if you are fine with playing on 1080p right now, don't feel the need to upgrade, and don't need more screen real estate, you can keep using 1080p.
That being said, if you did mean an actual 720p monitor, it's time to upgrade, anon.
thank you, i am just looking for a monitor for uni, something to plug my laptop into
i already have a 2k 144hz monitor at home, it's pretty cool, but i find 1080p stretched rather blurry on it
if i decide to take it home from uni, i can use it as a second monitor to my 2k one, so both are nice and 27 inches
it would be nice to save some money on a full hd one instead of going for 2k, but on the other hand i more so browse the net than play stuff
If you're gonna be on 1440p, there's 240hz monitors now. Honestly I give very little fucks about 4K for a computer monitor. I watch media on a 4K TV but motion clarity and refresh rate are something I value a lot more.
144 hz is only viable if you play older games. In new games you're forced to keep upgrading the CPU.
>get 4k monitor
>no games worth playing in 4k
>only bugs in 4k
>physics tied to fps
>game movement tied to fps
>character movement speed tied to fps
Just stick to 1080p 60fps. It's not like you have someone to sit with while you play or watch anything
I have a 32", 4k, 60hz, VA samsung monitor bought for 280 euros. cry about it.
basado
>va
Samsung's VA is pretty good, actually.
Assuming anon has one of the newer ones.
I have a 1440, 165 hz monitor.
4k high refresh rate was too expensive and you're also running into needing a better gpu to support it.
4k by a mile. My next monitor is going to be an 80" 8k one, only retarded kids care about refresh rate, because if they actually cared about motion clarity they'd still be using 20 year old CRTs.
There are clearly perceptible benefits to going up to 90 Hz and above. It will not only affect your games - it'll improve every aspect of PC usage.
It also means that you don't have to be able to push the full 144 Hz to feel the benefits.
4K, meanwhile, does have its immediate benefits. But, speaking bluntly, if you have to ask this question, you don't really need any of the immediate ones.
4K is harder to drive, and unlike with the refresh rate, if you aren't utilizing the display resolution fully, you are getting a worse experience than you would've received otherwise (i.e. by having a lower-resolution monitor and running it at native resolution).
In the context of budget restrictions, I would imagine you are also restricted in your other hardware, GPU included. So there's a good chance you won't be getting full mileage out of either, so in this scenario going for the higher refresh rate option makes more sense.
I don't know what kind of monitor you are currently on but the general approach is
>Upgrading from 1080p@60 Hz to either 1080p@120+ Hz or 1440p@60 is about equal in terms of the overhead. And it "makes sense" about equally (rough numbers in the sketch after this post).
going for 120+ Hz would result in all the aforementioned benefits. 1440p would improve the picture noticeably enough, and would make bigger-diagonal monitors, such as 32", more reasonable to use
>Going from either to 1440p@120Hz is a great stop-gap before getting an expensive high-end high refresh rate 4K monitor and the necessary hardware to drive it.
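If it's useful, here is the "about equal overhead" claim in rough numbers (pixel rate relative to 1080p@60, resolution times refresh only, which is obviously a simplification of real GPU load):

base = 1920 * 1080 * 60
steps = {
    "1080p@120": 1920 * 1080 * 120,
    "1440p@60": 2560 * 1440 * 60,
    "1440p@120": 2560 * 1440 * 120,
    "4K@144": 3840 * 2160 * 144,
}
for name, rate in steps.items():
    print(f"{name}: ~{rate / base:.1f}x the pixel rate of 1080p@60")
# 1080p@120 ~2.0x, 1440p@60 ~1.8x (about equal), 1440p@120 ~3.6x, 4K@144 ~9.6x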
>it'll improve every aspect of PC usage.
No, it won't.
Wow, your mouse cursor will have more afterimages on your garbage sample and hold display. Refresh rate makes ZERO difference to anything but gaming. Browsing? That's a static page. Watching a video? All that shit is locked to 60hz.
Higher resolution means more screen real estate, assuming you buy an appropriately sized monitor. For 4k, that's 40" minimum. Nothing beats having more resolution. You can throw your secondary monitor in the trash, you'll never need it again.
>Refresh rate makes ZERO difference to anything but gaming.
Tell me how I know you don't have a high refresh rate monitor.
Go on, explain how it improves reading a webpage. Are they doing video ads at 144hz now or something?
So you genuinely don't have one, I see.
I don't really get when people assume a strong position regarding topics they don't understand but I digress.
General PC usage in the modern day and age mostly involves using a mouse and scrolling pages on the Internet.
Those are very "analog" kinds of inputs, meaning they are precise and continuous. You, as a human, notice responsiveness in these kinds of inputs much better than in the ones that are more abstracted away from you. I.e. people are less sensitive to more responsive button presses than to more responsive mouse movement.
It also helps that the results of those inputs are very clear and easy to notice. Moving the camera around in an FPS game produces a lot of change in the perceived image, while moving the mouse around on the desktop only moves the cursor, making it easy to see the improvement.
To put it simply, the biggest WOW factor one gets from upgrading to a high refresh rate monitor comes not from games but from desktop usage. The difference will be obvious, immediate, and massive.
Meanwhile, FPS games might actually make one doubt the difference for a moment until they try seeing it at 60 Hz again.
There is some objectivity to those factors as well. A more responsive mouse cursor is more precise and easier to use.
Smoother page scrolling helps you process information better as you scroll. There's a good reason most phones are 90+ Hz nowadays when the only thing people do on them is scrolling shit.
Also, take it easier with your CRT Kool-Aid. Unlike you, I have actually grown up using multiple of those and had multiple encounters with them in recent years.
No, they are not magic.
No, modern LCDs do not leave cursor trails.
Yes, you are stupid.
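Some toy numbers behind the responsiveness point, assuming a plain sample-and-hold panel and counting only the display's share of the delay (input devices and the OS add their own on top):

for hz in (60, 90, 144, 240):
    period_ms = 1000 / hz
    # from any given input, the next refresh is on average half a period away
    print(f"{hz} Hz: refresh every {period_ms:.1f} ms, ~{period_ms / 2:.1f} ms average wait for the next refresh")
# 60 Hz waits ~8.3 ms on average vs ~3.5 ms at 144 Hz, and motion is sampled 2.4x as often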
Got it, you're a mentally ill zoomer who "doomscrolls" all day. I get that you don't have much of a choice because you're probably using twitter or some other infinite dogshit service, but for actual webpages, there is very little scrolling. You scroll to where you want to read, and then you read. You aren't fiddling with your mouse, you aren't rocking some gay 1px scroll up and down like an autistic clown, it's a static fucking page filled with text. If you had a screen bigger than a postage stamp, maybe you'd be able to read things without having to scroll every other line.
I've got an old 120hz 1080p screen, I didn't care much for it. Is it an improvement? Sure. If I had another grand to blow on my monitor, maybe I would've gone for a 4k144 instead. Given the choice between a smaller high refresh display, and a larger low refresh one, I went with the sane choice, because being able to see 4x as much text at once is pretty useful, you only need to scroll 1/4th as much. Large format images don't need downscaling to fit, I no longer need a vertical monitor for some formats.
>for actual webpages, there is very little scrolling
Examples of actual webpages? I notice that this thread, for instance, fails your criteria for being an "actual webpage".
I've got to be honest man, I don't think you understand the function of scrolling. You scroll, which is a very fast, sub 100ms action to get to where you want to be on the page, and then you spend a couple of seconds reading it. Do you go line by line? When you read a book, do you get frustrated at how the evil author has forced you to "chunk" every action by turning a page instead of leaving one continuous stream of text that you move past your eyes, instead of moving your eyes to follow?
Lmao what is this drivel. Nobody reads while scrolling. Any decent smooth scrolling implementation will look good on any refresh rate and it's literally something that takes like no time at all. It's literally same shit as with the mouse cursor - you play with it for 10 seconds and then you stop caring about it.
144hz every time. 4K is an auxiliary feature, it's harder to get good performance for and really just means I have more visual clarity, but it doesn't mean a goddamn thing for gameplay besides making out details in the far distance better. If you're buying a computer monitor solely for 4K experience without thinking of additional features, you're being ripped off.
I'll take framerate over resolution but TAA, DLSS, and FSR look like ass below 4k. It sucks that most games now use temporal AA and don't bother to offer a way to downsample.
32" 1440p 120-180hz
More space, reasonable performance, no retarded high dpi that shits the bed on windows. CRT shaders look good.
I have both a 4k60 and a 1440p240 monitor and 95% of the time I just play on the 4k one, but that's just me. Realistically though both are fine, so just get whichever is in your price range.
2 screens
1080p 60hz
Dont need more
definitely 144hz
Why not both?
bump
What monitors?
Left is LG 27gp850-b
Right is Samsung M7 32 inches 4k
Thank you
4k is only worth it on 32" or higher screen sizes, 144hz is a minimum if you play competitive online games
not getting 4K until program scaling issues are no longer a thing
>moving from IPS to VA
yay or nay?
VA blur gets worse the lower the refresh rate, and it's still noticeable at 144hz, especially on black objects. Also watch out for black levels on HDR screens, some look washed out and grey
4k, though I'd go for 144Hz if you're a competitive gaymer which you probably are as a poorfag.
I'm not too poor to justify both, but I can't justify shit HDR and I can't justify anything but IPS either so I'm stuck on my 1080p60 TV waiting for microLED which I am too poor for currently.
The Cooler Master GP27U almost got me but apparently it's got some bad flickering and response time.
This sucks, man.
>calls people poorfag
>cant afford anything above 1080p 60
Stop the elitist larp at least
I can afford it you retard I'm just not upgrading to anything below essentially perfection.
>I can afford to buy food but I just choose to dumpster dive
>I can't justify anything but IPS either
Samsung's VA is very good, actually.
Odyssey Neo G7 is as close to "perfect" for a non-OLED monitor as it currently gets.
Is it? I haven't given VA serious thought in a while but as far as I remember the bad side is mostly ghosting, being blurry and all that. If the downsides are mitigated and colours are good I might go for it.
I've had this little nigga for 7 years, I'm not actively buying a new poorfag monitor.
>Is it? I haven't given VA serious thought in a while but as far as I remember the bad side is mostly ghosting, being blurry and all that. If the downsides are mitigated and colours are good I might go for it.
I've been using one for a year and I absolutely love it, however, I was 100% aware of what I was getting into. Your mileage might vary.
>The colors are fundamentally great. Not the "professional artist" tier, but more than enough for media consumption.
>It comes fairly well-calibrated as well (I have a hardware color calibrator, so I could check). Not perfect, but decent enough.
>The curve is there and it's not really doing anything useful, but you'll stop caring about it in ten minutes and stop noticing it in a few months.
>The panel being VA still has some color-smearing implications. However, the low response time ranges are much narrower than on other VA panels, so it's exceedingly rare (but not impossible) to encounter the typical VA smear outside of test scenarios.
>The HDR performance is very good but obviously not OLED good. Fully calibrating HDR mode is not very realistic but you can get reasonably close to it being accurate enough
>Local dimming will add some response time delay. It's not noticeable 99.9% of the time in non-test scenarios, but you might notice it fairly clearly in panning scenes in movies, for example.
>Connecting via DisplayPort will force the monitor to use DSC at all times, meaning you can't enable driver features that conflict with it. For Nvidia, that's DLDSR or integer scaling. If you connect via HDMI 2.1 _and set the refresh rate to 120 Hz on the monitor_ you can use those features
>Samsung's QA is shit. Ordering one is a lottery. I got lucky.
So, the bottom line, it's awesome but far from flawless. If you know what to expect - it's an amazing monitor. But be aware of what you are buying. I highly recommend testing one in-person before buying, if you have that opportunity.
>Samsung display
>well calibrated
Enjoy your washed out blacks and obvious backlight
Anon, you might be retarded, but I have literally mentioned that I have a hardware calibrator.
enjoy the 1080p 24" poorfag
>Samsung shills on full force
All I'm saying is that people should go to a store and compare images of similar screens, and watch for backlight bleeding, black levels and colour contrast. Do your own research and you will see what I mean.
>list mostly negatives
>be called a shill
You should probably read the posts you are responding to.
>noo dont call out the bads in my product noo
Stfu shill
>actual retard
okay then, do your thing
Yet, cannot disprove my claims of image quality because they are true
Anon you are literally retarded.
I repeat, read the post you are responding to.
Neo G7 has good image quality out of the box.
My claim is that I actually FUCKING CALIBRATED it and I know the difference, while your claim is
>I saw it in a dream
essentially.
You had no argument at the moment you made your retarded statement, which was pointed out.
Next thing, the Neo G7 is an HDR FALD display with 1196 backlight zones. It can't have backlight bleeding by definition.
And it has some of the best contrast ratios for non-OLED displays out there because it can fully shut the backlight off. You'll need an OLED display to actually realize that the blacks Neo G7 have are not "true blacks".
We are talking here about a premium monitor that costs over $1000, not the cheap $20 crap you saw in a garbage bin once and made all of your judgments on.
You have absolutely no idea what you are talking about. Your experience, assuming you actually had any, cannot be applied here. You are out of your lane. You are unfit to participate in the conversation.
And yet you keep pushing it and spitting more shit you have absolutely zero understanding of.
Use your head sometimes, for fuck's sake.
>shill
People who use that word and "goy" are literally poor cunts who slate others' purchases.
Yes, I do own a G8, but I also own a 1440p Asus and a 1440p MSI monitor, so am I a shill for them too?
I was looking for a good example of smearing during scrolling on the Neo G7. I didn't find the one I was looking for (there was a nice, simple table of colors with their color codes in it, which made it really easy to see which colors a given monitor struggles with), but here's a simple example:
>https://ludens.cl/photo/montest.html
on this page, the pic related part has noticeable shimmer when scrolling up and down.
Everything else looks perfectly fine.
On a typical VA monitor, about 30% of the grayscale section examples will have some issues with them, and the black rectangles themselves will visibly jiggle around while scrolling.
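Since I couldn't find that color table again, here's a rough stand-in you could generate yourself, a minimal sketch assuming you have Pillow installed (the patch colors, grid layout and file name are my own picks, not the original table): it renders a grid of gray and dark saturated patches labeled with their hex codes, and you scroll the saved image in any viewer to spot which tones your panel smears on.

# smear_test.py - minimal sketch of a scrolling smear-test chart (assumes Pillow is installed)
from PIL import Image, ImageDraw

# Arbitrary picks: a grayscale ramp plus a few dark saturated tones,
# since dark transitions are where VA panels usually struggle.
grays = [(v, v, v) for v in range(0, 256, 32)]
darks = [(64, 0, 0), (0, 64, 0), (0, 0, 64), (64, 64, 0), (0, 64, 64), (64, 0, 64)]
patches = grays + darks

COLS, PATCH, PAD = 4, 160, 20
rows = (len(patches) + COLS - 1) // COLS
img = Image.new("RGB", (COLS * (PATCH + PAD) + PAD, rows * (PATCH + PAD) + PAD), (128, 128, 128))
draw = ImageDraw.Draw(img)

for i, rgb in enumerate(patches):
    x = PAD + (i % COLS) * (PATCH + PAD)
    y = PAD + (i // COLS) * (PATCH + PAD)
    draw.rectangle([x, y, x + PATCH, y + PATCH], fill=rgb)
    # Label each patch with its hex code so you can name the offending tone.
    label = "#{:02X}{:02X}{:02X}".format(*rgb)
    draw.text((x + 4, y + 4), label, fill=(255, 255, 255) if sum(rgb) < 384 else (0, 0, 0))

img.save("smear_test.png")
print("Saved smear_test.png - open it and scroll up and down to look for smearing.")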
Intredasting. I'm still not too keen on the curve, but I'll keep it in mind.
oh wait
>curved
Do you own and use a mouse? If yes, then 144hz is the most important.
Doesn't 720p scale up better to 1440p because it's exactly double the resolution? Half of 1080p is 540
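For what it's worth, you can sanity-check that with a couple of divisions. A throwaway sketch (the resolution lists are just my own picks of common sizes):

# integer_scale_check.py - does a source resolution divide evenly into a target panel?
def scale_factor(src, dst):
    """Return the scale factor if dst is an exact integer multiple of src, else None."""
    sw, sh = src
    dw, dh = dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

panels = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
sources = {"540p": (960, 540), "720p": (1280, 720), "1080p": (1920, 1080)}

for pname, p in panels.items():
    for sname, s in sources.items():
        f = scale_factor(s, p)
        print(f"{sname} on a {pname} panel: " + (f"{f}x integer scale" if f else "non-integer, will blur"))

Running it shows 720p landing as a clean 2x on 1440p and 3x on 4K, while 1080p only integer-scales on 4K, which is exactly the "half of 1080p is 540" point.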
144hz
Higher fps is more useful than higher resolution. Raising your resolution is one of the biggest drains on performance. Something like 1440p, or "2k" as it's called, is more than great already. 4k is overkill and, as I said, kills performance. After a point you don't even notice the increase in resolution, so it doesn't really matter. You will notice the increase in FPS though, and a 144hz monitor will allow you to see that increase.
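To put some rough numbers behind that, here's a back-of-the-envelope sketch; real GPU load doesn't scale perfectly linearly with pixel count, so treat it as a ballpark only:

# pixel_budget.py - rough pixel-throughput comparison between resolutions and refresh rates
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
refreshes = [60, 144]

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP per frame ({px / base:.2f}x of 1080p)")
    for hz in refreshes:
        print(f"  at {hz} Hz: {px * hz / 1e6:.0f} MP/s to push")

4K is 4x the pixels of 1080p and 2.25x the pixels of 1440p per frame, which is why the resolution knob hurts framerate so much more than most graphics settings.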
The truth is they're both memes. They're just for people who get a rush out of seeing big number go up. Just get something that can display 1080p at 60fps/hz and you're good if you're on a budget. Everything else is just excess.
Don't know about the G7, but the G5 is hot garbage
If you can't afford a proper display, you can't afford the hardware to push the frames either.
I'm still waiting for a decent 4k oled monitor.
>"Just get a C2"
Too large.
PG32UCDM if/when it actually releases.
Just stick with 1080p 165hz unless you for some reason have enough money to get good fps at 1440p without melting your pc
If you don't plan to get a 27+ inch monitor, I would go high refresh rate first.
just buy lg c2 and forget
..to turn it off and destroy the screen with burn in
>fell for burn in meme in 2023
>thinks that because a year has passed on a calendar everything is different
If burn in wasn't a thing then why do OLEDs have burn in protection measures?
So you can leave them on auto and forget about burn in?
I have a $75 TV i got from walmart
This is a problem I'm having.
About to inherit a bit of money.
Never had anything more than this dell 1440p 60hz
Was looking at 4k but there's UHD and DCI, do games even support DCI?
I would probably rather get a 1440p Monitor at 144hz, but i've never had a high hz monitor.
I also do art, so a high res would be better. Can't really mix a 144hz 1440p and a 4k 60hz, can I?
Since this is the video games board, 144hz.
but I program on my computer too and for that 4k is way more important.
>and for that 4k is way more important.
Are you running your system on 100% scaling?
yes 32"
You have good eyesight, then; I envy you.
I'm running 150% on a 4K 32" myself. I can manage 100% but it's uncomfortable, so for me, there's no real difference between 1440p 32" and 4K 32".
Why couldn't you just use your brain?
It's funny how after years of using 24" 1080p monitors, everyone somehow thinks that using something with essentially the same pixel density as a 16" 1080p monitor is going to be fine at 100% scaling. I went down to 43" for my 4k monitor, which is a little smaller than ideal, but I just sit a few inches closer. For 32", you've got to sit a lot closer for a usable image.
dumb question but is there any way to prevent dead pixels?
also no harm in leaving it permanently on (idle/sleep mode for the night)?
>dumb question but is there any way to prevent dead pixels?
Pixels rarely die during the life of a given monitor. Almost all monitors that have dead pixels were shipped with them.
So make sure you check your monitor for them as soon as you get it (there's a quick full-screen colour test sketched after this post), but after that, no additional action is needed.
>also no harm in leaving it permanently on (idle/sleep mode for the night)?
I would assume not. The idle mode does turn the relevant parts of the screen off, so it's not actually doing anything. Aside from scenarios that involve stuff like the wiring in your house being shit or a bolt of lightning hitting it, there should be no real threat in keeping them on and idle.
You might want to avoid keeping it permanently displaying something, because LCDs are still subject to getting burn-in, even if nowhere near the degree OLEDs are.
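And since the question came up, here's the quick full-screen colour test I mentioned, a minimal sketch using Python's built-in tkinter (the colour list and key bindings are my own choices): it fills the screen with solid colours so dead (always black) or stuck (always lit) pixels stand out. Click or press any key to advance, Escape to quit.

# pixel_check.py - cycle full-screen solid colors to spot dead or stuck pixels
import tkinter as tk

COLORS = ["black", "white", "red", "green", "blue"]

root = tk.Tk()
root.attributes("-fullscreen", True)
root.configure(bg=COLORS[0])
idx = 0

def next_color(event=None):
    # Advance to the next solid colour in the list, wrapping around.
    global idx
    idx = (idx + 1) % len(COLORS)
    root.configure(bg=COLORS[idx])

root.bind("<Button-1>", next_color)  # left click advances
root.bind("<Key>", next_color)       # any key advances
root.bind("<Escape>", lambda e: root.destroy())  # Escape quits
root.mainloop()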
>as soon as you get it
Yeah, I did that with my new monitor and no dead ones. My nearly 10-year-old monitor has one, but I'm very certain it didn't ship with it. It could have and I just didn't pay attention until I looked for it, but one dead pixel is nothing.
>because LCDs are still subject to getting burn-in, even if nowhere near the degree OLEDs are.
oh okay I didn't know that, the worst thing that I have is the tab bar at the top of my browser and I assume that can't hurt much assuming it goes off every day?
thanks overall 🙂
>but I'm very certain it didn't ship with it
It's still possible, just very uncommon. You got unlucky, anon.
>the worst thing that I have is the tab bar at the top of my browser and I assume that can't hurt much assuming it goes off every day?
Yes, you'll be fine. It normally takes some deliberate misuse to get visible burn-in on an LCD so as long as you are using it in a reasonable fashion (i.e. don't leave it on displaying a static bright image for weeks) you won't really need to do anything to avoid it.
LCDs can't get burn in, you fuckwit
Now go and do a single google search.
Why dont you provide it instead you fucking loser?
This monkeybrain thinks some old VA with retention from years back = burn in.
>Why dont you
Why don't you shut the fuck up and fuck off then?
You are objectively fucking wrong, and I bet that's not the first time you are saying stupid shit ITT.
I'm not here to educate your stupid ass on things that you should at least fucking google before saying.
What next, do I have to prove to you that you'll die if you stop drinking water?
You have to be at least 18 to post on this website, you illiterate chimpanzee.
Holy fuck, I hate genuine retards.
Try some better bait
1440p is basically just 1080p
80 IQ is basically 100
A 144 hz monitor is cheap as fuck nowadays.
a 4k monitor is still expensive though
so the choice is clear.
Why would it be either or?
A 4K display can perfectly scale 1080p and thanks to DLSS you can play newer games at 1080p and have it look like 4K and get the PPI benefits of 4K.
Inb4 the tech-illiterate downy starts saying BUUUUUUUT NOOOOOOOOOOOOOO 1080p ON A 4K MONITOR LOOKS BLURRY AS SHIT
I use a 48 inch oled as multiple monitors. 1440p game window takes up most of the screen and i got one or two open windows below or beside the game
I've definitely had burn-in on LCDs, just saying. It doesn't happen much any more, but the picture quality of my TV from 15 years ago was fucking destroyed by staying on one of those TV guide channels when they were still around. I remember specifically playing MGS2 HD and it still had lines and the cable company logo in the image. Maybe it doesn't qualify as "burn in" because that specific thing would go away, but it was basically just replacing old shit with new shit, even if it wasn't static. There were always weird lines and blurry overlays in it forever; the TV was garbage. I won't get an OLED until it's truly fixed to the level of a modern LCD.
You're confusing plasma with lcd
It was a Philips 47 inch LCD. I've also experienced burn-in on a 4:3 Xerox LCD monitor from 20 years ago.
Don't reply to that retard.
He's too dumb to type
>can LCDs get burn-in
in google and read the result.
He's an example of an utterly computer-illiterate retard who somehow managed to remain this ignorant and stupid in the day and age when even 80-yo grannies have enough technical know-how to do a google search.
This monkey thinks LCDs get burn in.
This monkey doesnt know the difference between plasma and LCD.
This monkey thinks retention is burn in. Even retention is difficult to get on an LCD.
>BUUUUUUUUUUUT MUH EXTREME CASES LIKE 300 NITS OF BRIGHTNESS FOR 24HOURS A DAY for 365 DAYS A YEAR FOR 10 YEARS WILL SURELY BURN IN HAAAAH GOTTEM!!!!!!!!!!!!!!!!!!
This dude is the tranny that says 4K gives higher FOV in games
>This monkey thinks retention is burn in.
Holy fuck, you're even dumber than I thought.
Very impressive.
https://www.rtings.com/tv/tests/longevity-burn-in-test-updates-and-results
>LCDs get burn in
>examples are from 15-20 years ago
I just joined this thread. If you actually read my post, I'm trying to make the point that it DID exist but doesn't happen much anymore, which is why I'm not getting an OLED until it's fixed. There will be a day when OLED doesn't burn in.
I'm not trying to shit on LCD, I'm actually defending it.
All screens will burn in if you use them long enough with static content (particularly if blasting brightness). The real difference is whether that happens over a timescale where it matters.
OLED burn in won't be fixed before microLED becomes viable I reckon and by that time lol OLED.
100+hz>real HDR>4k
mobo and graphics card recently fried and now I have to use a 15 inch laptop to shitpost on 4chin
I love me Samsung QN90A QLED
The fact you even ask this is proof this place is shit. It's basically a question between
>smoother, better gameplay
>a little bit prettier picture
32 inch.
1440p.
144hz (or 165hz).
For me it's pure goldilocks zone.
Side by side windows are perfectly large for doing work tasks. Full screen feels great for immersive gaming. Great size for films.
The bump to 4k would be awesome, but at that rate I'd rather put the cost difference into something that rounds out the setup more, like a better sound system or some kind of ergonomics for comfier desktop time.
4k is a meme resolution. Go for refresh rate.
1440p is gay and retarded and i've never used it or even seen it in real life, and i haven't touched a monitor that has it and it's gay and a poor man's 4k but not 4k
4k is the MINIMUM monitor resolution, 5-8k is best
1440p made some sense over a decade ago when you wanted to spend a little more for a monitor that was a bit bigger. It's gay and retarded now that you can get 4k for sub 300 bucks, though.
>It's gay and retarded now that you can get 4k for sub 300 bucks, though.
$300 would get you either a decent 1440p monitor or an absolute dogshit 4k.
27" monitor is big enough. Going much bigger would be pretty awkward, I feel like I would need to sit further away from the screen. And 1440p is common 27" resolution
>27" monitor is big enough.
I have 43" monitor, and I don't think it's big enough. 85" would be good for a desk.
i want the new apple xdr display, 36" 7k, i don't care if it's not gaming focused, it will look amazing playing star citizen
>36" 7k
God, apple monkeys are the dumbest retards around.
Imagine wasting all the power on that resolution because you're too stupid to pay for cleartype.
would rather have a good game
I bought a Xiaomi 200hz monitor, probably my best purchase. So cheap too, with no retarded vidya tranny branding or logos.
Objectively correct answer is that it depends on what games you play
IDK. Beyond saying something like
>you only play arena shooters? always choose high refresh rates
those people already know that
If you play a variety of things, or maybe slower-paced games, it's not as obvious.
If you are not competitive you get a screen that looks better, usually.
>too poor
It's not even a question of money. You just can't run most games at 4k 144hz regardless of setup.
t. 4090, 7800x3d build
What are you running that you can't play at 4K/144 with a 4090/7800x3d?
I don't really understand this. TPU shows that it gets 125fps 1% LOW and 170fps AVG with that setup. That seems quite capable and worthy of saying "it runs 4k 144hz". It literally does it.
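Taking TPU's figures at face value, you can check them against the 144 Hz refresh window with one division each:

# frametime_check.py - compare reported framerates with the 144 Hz refresh budget
refresh_hz = 144
budget_ms = 1000 / refresh_hz  # ~6.94 ms per refresh

for label, fps in [("170 fps avg", 170), ("125 fps 1% low", 125)]:
    ft = 1000 / fps
    verdict = "inside" if ft <= budget_ms else "outside"
    print(f"{label}: {ft:.2f} ms per frame ({verdict} the {budget_ms:.2f} ms budget)")

So the average sits comfortably inside the 144 Hz window and only the 1% lows dip under it, which is exactly the situation VRR exists for.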
4K benefits
>can literally see twice as much detail
144hz benefits
>uh... idk... 60fps looks a bit weird after a few weeks now...?
You have never played an FPS game at >60Hz.
I recently played Max Payne 2 and even 240fps was a noticeable improvement over 120fps. 60fpslets make me laugh.
Can you describe the improvement.
less frameyness
it's like a perfect motion blur that doesn't degrade the picture quality
Enjoy your input lag and grey blacks you fucking retard.
Bros, what do you think the best possible GPU for 1080p is? I wanna splurge on something for muh graphix and muh goytracing, but I don't feel like upgrading all my monitors to 1440p.
4090
Is it really? I thought a 3070 was already pushing it for such a small res.
60FPS is fine for any video game, 120 if you're playing an online twitch shooter. A smooth frame rate with a high level of detail is better than a high framerate with ugly graphics.
I use an ultrawide 2K resolution monitor, it does up to 180 FPS, some games I play 180, others 144, others 100, I don't mind 60FPS on single player games. Frame generation is becoming a thing for AMD as well, which can double framerate pretty effortlessly. What I'm saying is I can be more flexible with a 2K monitor setup than a 4K. I've got a $500 graphics card that can run everything fine, if I were running 4K I'd need a $1000-1600 card for comparable performance.
Getting a second monitor soon, recommend a 1440 144hz that's not ridiculously priced and isn't going to shit itself in a month?
1080 240 is goated for old games and cs, shit is so smooth