Get a cheap second hand plasma. 60hz on a plasma looks smoother than even 120hz + BFI on an LCD/OLED because plasmas have near-zero motion blur.
>will watching tv in the dark make me go blind too
I mean, it does? Have you looked at the statistics on how many people in the Western world, Japan, Korea, and China need glasses? The rate has gone up something like 10x in the past ~40 years. Something like 95% of Korean military-age kids need glasses nowadays. Even in the West it's 50%+ because people stare at monitors all day. It's crazy.
you're moronic. it's the chemicals in the water that kill your brain
I looked at the sun for 5 minutes straight from the car (I was 6 or 7) while my parents were driving. Nothing happened. I've been gaming 6-12 hours a day since and I still have 20/20.
Asians in general have such shitty soy-filled diets and lack vitamin D so they are much more defective
all the eye doctors I asked told me staring at a screen does not lead to myopia. it's a bullshit myth perpetuated by boomers. they also said myopia is hereditary.
t. keratoconus sufferer
Fun fact: there's no way to fix it and it advances until you go blind
2 years ago
Anonymous
Should've played outside more.
I'm not gonna watch a video on your college dissertation anon. You should also know that the "eye exercises" Chinks force their kids to endure every day won't do jack shit for you.
>More research is needed.
They don't even know if blue light is bad, but they peddle the science to sell products, let alone settle the question of screens in the dark. https://www.webmd.com/eye-health/blue-light-health
Good motion blur implementation doesn't smear the entire screen. It's rare that a game gets it right, but it does happen.
>disliking the image being clear
You're the ultimate npc anon
Not the point, moron. Especially when most people have LCD motion blur as well while I don't.
People hate it because Unreal Engine 4 ruined a generation of games with poor graphics programming. At this point there should just be some kind of list of games with motion blur that actually does anything besides being a worse TAA.
>Good motion blur implementation doesn't exist
fixed that for you
The image shouldn't be clear when you race at 1000km/h. Your hand doesn't even look clear when you wave it lightly in front of your face, and this is something a screen cannot reproduce (the blur from the LCD itself is something else entirely and should be removed).
Motion blur got a bad reputation in the PS3 era but we're moving away from that. 98% of games have bad motion blur, yes, but for the 2% that do it right you're doing yourself a disservice if you turn it off.
alright, what's your solution to this problem, then?
Turn ingame motion blur on or off depending on the game. Even in a game like Stray the motion blur bothers me because it's so strong, which is a shame, so that's staying off for me.
However, in a racing game it's a shame to turn it off. Some games also have only per-object motion blur active, which is great.
okay, but real life doesn't have motion blur, though? if you play a racing game with 200 frames per second, don't you get natural motion blur from it, then? i doubt motion blur is necessary under such conditions. remember, a lot of people play racing games with only some 50-something frames or a little bit more. i can imagine playing the same game with 200fps would deliver that much sought-after motion blur effect?
>real life doesn't have motion blur, though
Of course it does, that's the problem, and that's what the effect tries (and often fails, admittedly) to reproduce. Never rode in a car in your life or something? If you look at the close scenery on the sides, the human eye will just blur everything that's too fast.
>i can imagine playing the same game with 200fps would deliver that much sought-after motion blur effect?
It won't. A computer screen is completely different from real life (you might instead see LCD motion blur, which is nothing like real life and needs to be avoided with BFI), and that's why ingame motion blur is needed. But too many games do it incorrectly.
Some games, like GT7, do it best: the game has motion blur mostly if not entirely on the sides of the screen to realistically blur the close scenery.
I had a 144hz monitor, changed it for a 4k one and never looked back. It's fricking fantastic.
For gaming I run most stuff at 1440p, but for browsing and everything else there's so much room I barely need to use the second monitor.
Every modern OS has UI/Text scaling, so you'll be fine.
It's just that text on a 4k monitor can be tiny yet still readable, so you can use that to get more working space.
Ok, and?
The point of high resolutions is to make the image look better. If you increase the monitor size you decrease the PPI, which makes it look worse.
>The point of high resolutions is to make the image look better
the point of 4k is to have 4 monitors for the price of 1. enlarging the text ruins that.
32gb gives you "future proofing". That 16gb of 3200 CL16 RAM is some basic-b***h cheapo shit and it stands out in a computer with consoomerist top-end hardware.
Okay, here's the truth and the only answer that matters: once you're used to high fps gaming (144 and above), it will be extremely hard to go back. 60 frames per second will feel very choppy and not fluid at all. I promise you, you won't ever go back again. No video card out there can play the graphically demanding triple-A games of today at those high frame rates at 4K (again, 144 and above). If you're sure you barely ever play these types of graphically demanding games and you prefer your somewhat more low-key indie games, then go ahead and buy the 4k monitor if you've got a good PC and money is no object. Otherwise, lower the resolution so you can come close to that high frame rate.
Anything above 60Hz is a meme. The only thing that matters is visual clarity, and so far only CRT can deliver in that department.
>XMP
That shit has never, ever worked for me.
Messing with timings and rebooting and resetting UEFI settings over and over until things work, then enduring a 4-hour memtest to make sure there are no errors is an absolute pain in the ass.
And god help you if you ever decide to clean up your PC because I sure as hell can guarantee it won't boot up once you put it together and you'll have to go through the same shit again.
I had to overvolt the CPU's memory controller to get it to run. I don't like doing that shit. And once I took out and reseated the one RAM stick, I had to set it up all over again.
>bought shit ram
a 16GB XPG Hunter DDR4 3200MHz U-DIMM
>a shit board then
an ASUS TUF B450-PLUS GAMING
>xmp is one of the easiest things to not frick up.
Never, EVER select XMP and call it a day. It can fry your memory controller. You need to carefully check that everything looks sane.
not that anon but enabling XMP makes my board shit the bed too and makes the PC fail to boot.
God-forsaken computers that you morons assembled. I have no idea how XMP can make a PC fail to boot. I've literally never encountered a problem with XMP on dozens if not hundreds of machines.
Some Z690 boards have a bug where they refuse to run RAM over a certain speed / with XMP
When I upgraded my old-ass build and got myself a 12700k, the first mobo I got (MSI) wouldn't POST with any RAM over 3000MHz; I tried 2 different kits (a 3600 Corsair and a 3200 Team). Furthermore, even when I changed mobo to a ROG Strix, the 3600 kit refused to POST as well.
That was honestly one hell of a week, since I had to spend every single second I wasn't at work trying to figure out what the frick was wrong. Had to settle for less than what I wanted, RAM-wise, but at least now it works.
Pic related
>Some z690 boards
Goes to show that money doesn't buy quality.
>money doesn't buy quality.
Of course not. Just look at how WD burned its brand to the ground after the stunt they pulled with their NAS drives (they surreptitiously swapped them over from CMR to SMR (i.e. less than 10MB/s writing speeds)).
Or just look at how SSD manufacturers ship cheaper, shittier revisions of their SSDs without changing the SKU.
dont worry anon. we have some good drives coming in 2023 and 2024
>Some z690 boards have a bug where they refuse to make RAM work over certain speed/with xmp
wow, didn't know this was a thing. that's the exact series of board i got. do they have any other weird quirks? i've been blaming my 12th gen cpu's e-cores for some *slightly* older games running like shit (total war warhammer 2 for example tops out at like 45 fps on a system that should eat it alive).
my board:
https://www.asus.com/us/Motherboards-Components/Motherboards/TUF-Gaming/TUF-GAMING-Z690-PLUS-WIFI-D4/
Afaik that's the only common issue. On my end I had a different one where every second boot would fail, but that was an issue with the motherboard itself. As for older games, I'm yet to see any issue that's not related to my GPU and my autism trying to see what games I can push to 4k 60fps (or the game itself having its own issues)
>weaker than a 3090
>weaker than a 3080ti
>trades blows with a 3080, but gets absolutely dumpstered if memetracing is enabled
>"3090 performance"
Bro the 3080 is cheaper AND stronger. Where do you get your Copium? I'm gonna need some soon for some upcoming vidya.
Put your memory in XMP profile 1. Get a 1440p monitor with a high refresh rate like 165hz or 240hz.
4k is gonna bog down your system and it's not worth the performance loss.
>AMD GPU on Windows
>16GB of RAM
>a shitty 5900X instead of 5800X3D
>3200MHz RAM with a Ryzen
There is so much wrong with this setup that it's just hard to know where to even begin to fix it.
Installing Linux to get the actual performance out of that GPU would probably be a start.
Unstable
Weird quirks with some software / games
Weird quirks if not using stock settings in the driver
Only recently caught up to NVIDIA with their AMF encoder. Finally caught up to 2012...
This. Nothing worth playing will play well at good settings at 4K and 60+ fps. Something like 1440p is a nice boost to your resolution (if your monitor is big enough), and you can enjoy high frame rates between 60-120fps, but you need the graphics card to support it. Even on my RTX 3070, I can get around 80-90 fps at 1440p. FPS would plummet at higher resolutions and I'd barely be able to see the difference on a desk monitor.
>(if your monitor is big enough)
You want to keep the monitor size low enough to keep the pixel density high. Increasing both size and resolution together does nothing or very little for image quality; it will look the same, just bigger, and you're pushing your hardware much harder for nothing.
Think about how crisp your phone looks compared to a big monitor.
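The density argument is just geometry. A quick back-of-the-envelope sketch (the sizes and resolutions below are illustrative examples, not any specific monitor):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Same 4k resolution, bigger panel -> lower density, softer image:
print(round(ppi(3840, 2160, 27)))   # 163
print(round(ppi(3840, 2160, 32)))   # 138
print(round(ppi(2560, 1440, 27)))   # 109
# A 6.1" phone at 2532x1170 lands around 457 PPI, hence the crispness:
print(round(ppi(2532, 1170, 6.1)))  # 457
```

Same resolution on a bigger diagonal always means a lower PPI, which is exactly the trade-off being argued about here.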
>4k is the biggest meme ever
For vidya? Yes. For productivity? Absolutely not.
split it into 4 1920x1080 screens, achieve multi-monitor glory.
For vidya, if you're not moronic, you play in full screen and use integer scaling.
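Integer scaling is exactly what it sounds like: each 1080p pixel becomes a clean 2x2 block on a 4k panel, with no interpolation to smear it. A minimal sketch with numpy (treating a frame as an H x W x 3 array; this is an illustration, not what any GPU driver literally does):

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor upscale: duplicate every pixel factor x factor times."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# 1920x1080 maps exactly onto a 3840x2160 panel at 2x -- no blur, no artifacts.
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
scaled = integer_upscale(frame, 2)
print(scaled.shape)  # (2160, 3840, 3)
```

Because 3840x2160 is exactly 2x 1920x1080 in each dimension, every source pixel covers a whole number of panel pixels, which is why 1080p can look sharp on a 4k screen while looking blurry on 1440p.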
1440p is poorgay cope. 4k makes all the difference. Maybe get your eyes checked, anon.
>1440p
dude lmao imagine needing over 1080p
1080p is ugly blurry shit that hinders immersion
>all talking about resolution instead of pixel density
morons
1080p might be okay on 19-24 inch monitors, but anything higher than that and you need to step up resolution
if your computer has not been upgraded in at least 4 years, you aren't going to get 144+fps solid on every game above 1440p, so what is the point
Things won't make you happy anon. I learned this the hard way. No matter how shiny the new toy is, it will always leave you unfulfilled and joyless sooner or later, usually sooner. Use the $700 on some form of self improvement. Don't fill your void with material pleasures like I did.
165hz feels a lot better than 144. I use 240hz but I struggle to see a difference between 163-165fps and 237-240fps. Remember to cap 3 fps under your refresh rate to make G-Sync work properly.
I would also make sure to buy a monitor with the lowest possible input lag; just google comparisons. Same goes for response time and error rate.
>this is a meme
Just because you have not experienced it (and I highly doubt your IPS monitor doesn't have some form of backlight bleed) doesn't mean it's a meme. Backlight bleed on IPS monitors is very real, maybe not to that extent, but the overwhelming majority of IPS panels suffer from it.
It's probably a decent enough monitor, though I've never seen that particular model in person. 27" at 4k is close to perfect for use at a desk I think, text looks very good and sharp and so do games. It's especially nice in games since the high DPI makes aliasing much less noticeable and the image is very detailed. I've been using 4k at 27" for about 7 years and there's no going back to low DPI once you've seen it.
Nah it happens, I assume some people are just completely moronic and use the monitor at max brightness in a dark room. Some screens will look similar to that.
This shit is a meme, not once in six years of using an IPS monitor has it ever affected me during actual usage no matter what I'm doing no matter how dark a game is. Meanwhile other panels I've used with the shitty viewing angles directly affected me every fricking day, if I leaned back slightly in my chair the monitor was unusable. THAT matters, this cherrypicked shit doesn't.
I literally bought 3 IPS monitors and they all had insane glow that was immediately noticeable the moment the screen wasn't blasting bright colours, playing fricking terraria was hell.
I will no longer buy garbage outdated technology, they need to move on to oled already.
You know technology evolves.
https://arstechnica.com/gadgets/2022/03/explaining-ips-black-the-display-tech-in-dells-new-ultrasharp-4k-monitors/
New Dell technology can hit 2000:1 static contrast with IPS.
Monitor tech doesn't really evolve, it just eats the scraps of TV tech 10 years later.
I'm sure this """new""" IPS tech still has glow and will still use shitty matte coatings.
165hz isn't a multiple of 60fps so I just can't recommend it. Go either 120hz, 180hz, or 240hz; anything in between is kind of a meme.
>4K on windows is hell
Not really. You can either go for 150% scaling (which does manage to break some games by making them think your panel is a 2k one) or 100% scaling, which makes text a bit too small until you get used to it (what I'm using and what I prefer).
>4K on a 27" size is utter garbage tier
You have no idea what you're talking about.
t. 4k 27" monitor user
guys, i'm in the market for a new 4k monitor, what is the literal best one? i'm not rich but i'm imagining that i am, so lay it on me, buds, give me your absolute worst of the worst (the best).
Anybody have an idea why Nier Replicant looks awful with HDR on? Even at 4k there are these weird black blotches in darker spots, while in SDR it looks normal.
The best shit is: Doom Eternal will give you 240hz. Then you have all the good multiplayer FPS games. Then you have all the older shit that's actually good (like classic Doom, Quake, etc). That's all you need.
nta but the point of a 240hz screen is its set-and-forget nature (it'll have correct frame pacing no matter the content outside of gaming, something that isn't true of 144hz monitors, for example, which can't display 60fps youtube videos correctly), and the response times are good enough for most lower framerates as well as full 240fps.
Just because you buy 240hz doesn't mean you NEED to get every game to run that fast.
But also, just because you get a fast monitor doesn't mean 60FPS will stop looking blurry. You need backlight strobing to make 60fps games look clear again.
the c1? isn't the c2 out now, though? the internet seems to agree that the second version is better..?
I hate the C2 because it has no 120hz backlight strobing anymore and slightly worse motion resolution at 60hz strobing. It's also not as cheap as the C1 was a few months ago, and unlike the C1 it doesn't have the higher-durability panel.
If you have no choice and don't give a shit about the black frame insertion mode at 120hz (a shame, because 120FPS + BFI looks fricking incredible on my C1, absolutely no motion blur, way better than CRT) I guess the C2 is ok. But I'd wait for WBE panel versions of the C2 at least (it's a lottery).
>for example that can't display 60fps youtube videos correctly)
Anon, there's no v-sync for youtube videos. Having a monitor that works in multiples of 60 (or 30, or 24, or whatever) doesn't mean you won't get screen tearing. You will. In fact, if you're working with a true multiple of the content's framerate, you're more likely to notice it as it's more likely to appear in the same spot.
Another thing is that past ~100 Hz screen tearing becomes so minuscule you won't notice it anyway.
?
This isn't about tearing or vsync, I'm talking about motion judder
A 144hz display won't show proper frame pacing for 60fps content and it's terrible to use day to day. I have to drop my monitor down to 120hz sometimes just because of that. This is true for any refresh rate that's not a multiple of whatever content you're watching. Try it out with this if you don't believe me: 60fps doesn't look smooth on a 144hz panel, but dropping it to 120hz will fix it. And 72FPS will look just fine on a 144hz monitor while looking terrible on a 120hz monitor.
https://www.testufo.com/framerates-versus#photo=dota2-bg.jpg&pps=1920&framepacingerror=0.1&direction=rtl&framerate=60&compare=2&showfps=1
Freesync does fix this, but as far as I know VRR won't activate on your webpage or video app.
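The judder math here is easy to check for yourself. This little sketch (my own illustration, not taken from testufo) counts how many refresh cycles each content frame stays on screen when the display simply shows the newest available frame at every refresh:

```python
from collections import Counter

def hold_times(refresh_hz: int, content_fps: int, n_refreshes: int = 48) -> list:
    """For each content frame, count how many refresh cycles it is held on screen."""
    # Refresh k (at time k/refresh_hz) shows the latest frame released by then.
    shown = [k * content_fps // refresh_hz for k in range(n_refreshes)]
    counts = Counter(shown)
    return [counts[f] for f in sorted(counts)]

print(hold_times(144, 60)[:6])  # [3, 2, 3, 2, 2, 3] -- uneven cadence = judder
print(hold_times(120, 60)[:6])  # [2, 2, 2, 2, 2, 2] -- every frame held equally
print(hold_times(144, 72)[:6])  # [2, 2, 2, 2, 2, 2] -- 72fps is fine on 144hz
```

So a 144hz panel holds 60fps frames for alternating 2 and 3 refresh cycles, which is exactly the uneven pacing being described, while any integer multiple of the content framerate gives an even cadence.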
Ah, you mean that.
Well then maybe.
Sounds like a superficial issue your eyes will adapt to in about two seconds.
I probably could notice it in a side-by-side comparison but I haven't ever had any issues watching 30 and 60 FPS content on a 144Hz screen.
>https://www.testufo.com/framerates-versus#photo=dota2-bg.jpg&pps=1920&framepacingerror=0.1&direction=rtl&framerate=60&compare=2&showfps=1
By the way, checking this comparison made me realize how terrible the response times on my VA monitor are.
I'd notice black smearing all the time and I'd be
>Yeah, whatever, it's just a specifically bad case
But looking at the video in both 144 and 120 Hz I was like
>Hell no, the """60 FPS""" part is not true 60
I can literally see it jittering back and forth roughly at the same speed as the bottom part
So in an attempt to be fair and scientific about it, I whip out my phone and take a 120 FPS video of my screen.
Then I go through it frame by frame and realize that the top part does indeed update pretty much exactly every two frames, but the response time is so fricking shitty that there's like 30% worth of an afterimage sitting on the screen until the next frame finally makes it go away, creating an illusion of 120 Hz """motion""" on a 60 FPS video.
>absolutely no motion blur,
That's complete BS. It's significantly reduced but it's still visible. We need like 480hz + BFI to get rid of it completely.
>way better than CRT
You're blind. 120hz with BFI still has motion blur. You can see it clearly with the test patterns in this. BFI is a massive improvement but the motion blur is still there.
And you can also use the test patterns on blurbusters - the motion blur is still easily visible with 120hz BFI.
You need a CRT or Plasma to get rid of it completely coz 120hz isn't anywhere near fast enough.
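The underlying rule of thumb (popularized by Blur Busters) is that eye-tracking motion blur is roughly the distance an object travels while a frame stays lit. A back-of-the-envelope sketch with illustrative numbers:

```python
def perceived_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate smear width: pixels traversed while one frame is visible."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # e.g. a test-pattern object moving 960 px/s

# Sample-and-hold 120hz: frame lit the full ~8.3 ms
print(perceived_blur_px(speed, 1000 / 120))  # 8.0 px of smear
# 120hz BFI at a 50% duty cycle halves the persistence
print(perceived_blur_px(speed, 1000 / 240))  # 4.0 px -- better, not zero
# A ~1 ms CRT/plasma phosphor pulse
print(perceived_blur_px(speed, 1.0))         # ~1 px, hence the clarity
```

That's the point being made: 120hz BFI cuts the smear substantially but doesn't eliminate it, and only very short persistence (a CRT-like pulse, or strobing much shorter than the frame) gets close to zero.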
While my monitor is definitely not the LG C1, that doesn't mean it's shit. When HDR works as it should it looks great. On games where it doesn't it looks like absolute ass
>that doesn't mean it's shit
9 times out of 10 it does.
If you aren't using an OLED, mini-LED, or at least something FALD with a shitload of individual lighting zones, you literally cannot see HDR content.
Your monitor can read HDR signal (hence why it's HDR "certified") but it literally lacks the hardware to display it.
Windows can't magically make up for a lack of proper hardware.
Seeing how many games don't even have VR support, no you're wrong. VR was advertised initially as a peripheral that you could slap and play STALKER or whatever atmospheric game, but turns out it is a platform that needs games to be specifically made for it, and people are fiddling with modding to be able to play traditional games on it. Clearly this is not the future of playing games but a separate platform entirely (which is a much less interesting proposal since it needs exclusive games to flourish)
The news isn't good if you want a metaverse reality; people don't like being glued to a VR headset for hours to work on shit.
They'll get lighter as they develop more and it's way cooler than gay old rectangle screens that make you lose your immersion when you see the wall behind them.
>it will be extremely hard to go back
takes like 30 minutes to re-adjust
t. someone whose 144hz monitor shat the bed so I had to go back to a 60hz monitor.
really not bad at all
Get it. I have the same monitor and if you have a 3080 like me you'll play games in 4k 120fps. 1440p is FRICKING dogshit in comparison. Poorgays will cope and say otherwise.
>3080 like me you'll play games in 4k 120fps
Complete fricking bullshit, your dogshit 3080 gets raped at 4K. Hell, fricking 3090s and 6900s get fricking raped. 80 fps MAX on the most demanding triple A games at 4K. Expect 60 fps on average
Stop spreading misinformation. I easily get 4k 120 fps on epic settings in Fortnite. Halo Infinite also runs native 4k, all settings maxed, at high framerates. Maybe it's your dogshit AMD CPU bottlenecking your PC? My 12th gen i7 does wonders paired with my 12 gig RTX 3080.
>27 inch
you should shoot for 32 inch @ 4K minimum at that price bracket.
https://www.ebay.com/itm/304490421697
>spending more on GPUs than the output for that GPU
moron. Not everyone lives by your broke ass spending habits. I take home $6,000 a month and spend almost every waking moment in front of my computer. Why shouldn't I spend $500-$700 on a nice display?
I'm not burning money though. I have an ultrawide 1600p monitor and it's glorious and I paid $740 for it 4 years ago.
It's pretty difficult to craft an argument that the display I stare at for 16 hours a day isn't "worth it". Especially when I can write it off as a work expense or have my workplace pay for it themselves.
tfw just got my Sony Inzone M9 today but it has a dead pixel. Part of me wants to return it, but the other part of me realizes it's not noticeable: you can only really see it on flat light colors, so 90% of the time I won't even see it. It will still live in the back of my mind, but man, I don't want to go through the hassle of returning it.
LG Ultragear screens are fricking awesome. But honestly I like 4K more for TVs (to hook your PC to), and ultrawide 3440x1440 more for computer monitors.
I'd buy Samsung Neo G7 were I to have literally any faith in Samsung's utter shit of a QA.
Looks like a perfect monitor at an acceptable price point for me, could drop the curve but I'd probably get used to it.
Sony - 27” INZONE M9 4K HDR 144Hz Gaming Monitor with Full Array Local Dimming and NVIDIA G-SYNC - White
>Sony’s 27-inch InZone M9 gaming monitor is practically a high-end 4K Bravia TV with a few features (and more than a few inches) chopped off. It can’t serve your channels or streaming apps, but this monitor can deliver full array local dimming for more accurate backlighting, which is something few TVs and even fewer monitors can do. It’s a great monitor if you watch a lot of HDR content on your computer — the matte panel looks bright, detailed, and gorgeous. It’s like watching a really good TV that’s not actually a TV.
>But the main reason to spend $899.99 on the M9 is to play games, and it ticks a long list of boxes that many PC and PS5 gamers have been waiting a long time to see. In addition to stocking a great 4K (3840 x 2160) panel, it has two HDMI 2.1 ports, Nvidia G-Sync compatibility for PC, 144Hz variable refresh rate (VRR), auto low-latency mode (ALLM), and other cool PS5-specific HDR features that display nerds will appreciate. Beyond stuff angled toward pleasing the most particular gamers, it has a slick, PS5-inspired design, USB-C video support, and USB-A passthrough for connecting accessories.
Sony started selling their own gaming monitors and is forcing everyone at EVO right now to use their gaming monitors and headsets under the INZONE branding, so all the EVO live streams you're watching have the players using Sony products.
AU Optronics is planning to release panels with over 500 local dimming zones later this year. This monitor is already obsolete at this ridiculous price.
For $900 you can buy the Inzone M9 which measures beautifully and has 96 dimming zones for a very decent HDR experience.
Those LG panels with Nano-IPS have dog shit contrast. Like seriously dogshit. If you don't want to spend $900, at least get the Samsung G7 (IPS) or the Gigabyte M28U.
If you work on this, go ahead. If you only play single-player games it will be nice. If you play multiplayer and still have to ask, you might as well buy it because you probably won't notice, casual.
Still too big for my desk.
Either way, OLEDs are notoriously bad for actual work (e.g. Excel, Word). Their burn-in countermeasures, such as reducing brightness, are highly annoying but a damn necessity if you want the panel to last more than a year without damaging itself.
Well yea it's a gaming TV. If you need the monitor/TV for work related things then I can see why you wouldn't get the C2.
Besides, if you ever do decide to get it you can just buy a cheap wall mount to fit the TV for viewing.
>If you need the monitor/TV
I want a single monitor for everything.
Fair enough, just be aware that it will take years before you see HDR and OLED on 4k monitors, and I guarantee those will be expensive as hell at release. Probably more expensive than TVs.
>LG
>not OLED
>4k at only 27"
>$700
for that price bracket, and since you don't seem to care about OLED or any sort of dark color accuracy, you should get something closer to an AORUS FV34U so you can watch movies from far away.
To make a somewhat ridiculous analogy, the monitor is to the PC more or less as the tires are to the car.
It's the monitor that makes the connection between the hardware and the user, so economizing on this part makes no sense. When I upgraded my PC I decided to keep using a 1080p/60hz monitor because I wanted to run everything on Ultra, which I could. But when I finally moved to the next stage, 1440p/144hz/G-Sync, I noticed how much I had been missing out on.
In my opinion, the higher the resolution and frequency the better, just take into account your pocket and whether your hardware will be able to take advantage of both the high resolution and the high frequency.
Note: IPS is really bad for dark games, be careful with this, Quake is almost unplayable on my monitor.
If you're in the monitor market for gaming, I recommend 1440p 240hz, either samsung VA or some other brand IPS.
240hz is nice for gaming, 1440p can be done with this gen GPUs
4k IPS 144hz is kinda hard to recommend. The PPI is too high to actually get anything out of it in most games, and you're unlikely to run it at full hz/fps without a 4000-series GPU, but if you're planning on buying one then go ahead, buy a future-proof display. Get a 32" though; at 27", 1440p makes too much sense.
If you're spending near 1k € on a display then don't buy IPS. It sucks: bad response times, backlight bleed, and it's unlikely to do good HDR.
The LG 42" C2 is a much better option; the Neo G7 and the Alienware QD-OLED are good too.
I just want a nice 1440p panel with HDR and FALD of at least 96 zones. I've seen the Inzone M9 and that thing is really nice for HDR, but it's 4K. There aren't any 1440p panels with FALD around.
>Alienware QD oled
Not interested in paying with my limb, don't want to deal with oled burn in, and lastly the subpixel layout of that monitor is really weird. Looks like a CRT shadowmask and makes text not very clear. It creates an effect similar to chromatic aberration.
https://images.frandroid.com/wp-content/uploads/2022/03/alienware-aw3423dw-defaut-pixels-1-scaled.jpg
I wouldn't be worried about burn-in. It's gonna easily last 5 years.
Despite all its issues it's the best gaming monitor.
It's not for office work, but who cares. Who here does office work?
>1440p panel with HDR and FALD of at least 96 zones
I would argue that HDR is a very premium feature and it is weird not to consider going for the mere "higher-end" 4K over 1440p.
You can use a 4K monitor to consume 1440p content - you can't do the reverse. Why would you invest a considerable sum of money in something that is already getting slowly outdated and will become effectively obsolete for the asking price in two-three GPU generations?
Don't you want your monitor to still be relevant five years down the line?
Not to mention, there's more to PCs than just gaming, and even then the
>you can't run games at 4K and high refresh rates
is literally not applicable to 99% of the games in existence.
>b-but it's old
Who cares? You can run DOOM Eternal at 4K, Ultra-Nightmare settings, at 80-120 FPS on a 3080.
Is it "old"?
Will you literally never return to it or this generation of games in general?
Will you never play some sort of indie "muh atmosphere" game like The Pathless or something that can probably run on an actual toaster?
You WILL benefit from a 4K monitor, there are no downsides to getting one outside of the price, and as a long-term investment, the price point you are considering can't reasonably be called "budget".
>Bilinear upscaling looks disgusting.
If you're pixel hunting static images at 700% zoom two inches away from the screen.
You will NOT notice the difference between 4K and 1440p on a 32" screen in a fast-paced 3D action game (the genre that benefits the most from high refresh rates) unless you stop and deliberately pay attention.
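For what it's worth, the 4K-vs-1440p gap at these sizes is straightforward geometry; a quick sketch (the function name is mine, the sizes are the ones under discussion):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# the sizes being argued about above
print(round(ppi(3840, 2160, 32)))  # 4K 32": ~138 ppi
print(round(ppi(2560, 1440, 32)))  # 1440p 32": ~92 ppi
print(round(ppi(2560, 1440, 27)))  # 1440p 27": ~109 ppi
```

So a 32" 4K panel is denser than a 27" 1440p one, but not by a factor anyone notices mid-firefight.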
If you're going for a high-end monitor with a high-end GPU, then there's the LG 42 C2, Neo G7, or Alienware QD-OLED.
Why burden yourself with a 4k ips monitor after all.
>then there the lg 42 c2, neo g7, alienware qd oled
These cannot be your main desktop monitor. If you can afford (quite often, not even monetarily but in terms of available room/desk space) a secondary monitor dedicated to content consumption exclusively, then sure, going OLED is the best option.
But as a universal monitor option OLED is borderline unusable garbage. It's too dim, it's not good at displaying text, and it WILL burn in rapidly, no matter how much you try to prevent it - and dancing around your monitor's burn-in problems is not a premium experience.
The Neo G7 can certainly be a universally good mixed-use monitor.
LG oled and qd oled are obviously superior for content consumption.
2 years ago
Anonymous
Yeah, I'm considering Neo G7 myself but it's Samsung and while their tech is great, their QA is fricking abysmal.
Yeah well what other monitor manufacturer are you going to rely on?
Acer, MSI, Gigabyte? Their QA is good thanks to using cheap chink IPS panels, since the tech is 20 years old.
so I'm still using a 1080p 24in 60hz LG monitor
any recommendations for a 144hz monitor with a higher resolution? preferably less than 200 bucks since I'm stingy with money
Yes there will be judder because 165Hz is not a multiple of 24. You can, however, easily drop the frequency to 144Hz. These 165Hz are, more often than not, actually overclocked. You're fine dropping the frequency.
>guessing it's going to be a BIOS setting
What? We're talking monitors here, not PCs. Just go to the display settings and change the frequency.
https://www.howtogeek.com/wp-content/uploads/2018/07/img_5b5113bd9d968.png?trim=1,1&bg-color=000&pad=1,1
Another alternative is leaving the monitor at 165Hz and using temporal interpolation. No, this is not the godawful motion-vector interpolation TVs do: it doesn't invent new frames, it just blends adjacent ones so the transitions look smoother. The player mpv can do this, as can MPC and, I think, VLC. Still, dropping the frequency to 144Hz is trivial and doesn't cause any issues.
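For reference, in mpv the frame-blending described above is turned on with a few options from its manual (these values are the common recommendation; tune to taste):

```ini
# mpv.conf
video-sync=display-resample   # lock video timing to the display's refresh rate
interpolation=yes             # blend frames across refresh cycles
tscale=oversample             # simple blend, avoids the soap-opera look
```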
Eye travel doesn't matter unless you're playing some pajeet game like league of legends or CSGO in which case just move the monitor 2 inches back lmao and get the same effect
Go for it. I have the 850-B which looks identical in terms of stats except 1440p and it's been amazing.
Although I would say for a screen this size 4K seems overkill. That's not sour grapes, it really is a suitable screen size for 1440p. Consider a larger monitor if you're dead set on 4K.
Curved is a shape
Black smearing is not a trend with Samsung VAs the way it is with cheap VA displays.
QC is what it is; buy one and hope it doesn't need to be returned.
>black smearing
You literally have no idea what you're talking about.
Yes I do.
No, you don't.
Its "black smearing" is less pronounced than the plain "smearing" of a typical 32" 4K IPS panel.
OLED would perform better but everyone knew that already.
>He thinks we are in the early 2010's
It detects when you have a PC or console connected and turns off everything that would frick with your gameplay. Unless you found some article saying otherwise that I missed when researching the TV before buying it.
Anon, you're moronic.
LG C2 is one of the very few "monitors" that actually has a real ~1ms GtG response time that is not just a marketing hoax.
5ms is its real input lag which is processing delay + response time.
>ips are usually in the 10-15ms range
It's not that bad. Decent IPS usually have an average below 10ms. Often 5ms average, with only the worst transitions hitting 13ms.
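As the post above breaks it down, the "5ms" figure isn't one number but a sum; a trivial sketch of that decomposition (function name is mine):

```python
def total_input_lag_ms(processing_ms, gtg_response_ms):
    """Measured 'input lag' usually bundles the scaler's signal-processing
    delay with the panel's pixel (GtG) response time."""
    return processing_ms + gtg_response_ms

# the LG C2 numbers quoted above: ~4ms processing plus ~1ms GtG
print(total_input_lag_ms(4, 1))  # 5
```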
do you guys know wtf these wavy lines on my display are?
I haven't been able to troubleshoot it or find any other case even remotely resembling this
ASUS VG248QE btw
I have had an ASUS VG248QE for close to 10 years that I use as a 2nd monitor, and it has had weird vertical wiggly lines all over for a few years. I assume it's just something that happens when the monitor gets old and half-way dies. I can only really see it on white/grey backgrounds though.
yeah it's a lot less noticeable when I'm doing most things but drawing on a white canvas it's distracting as hell
it happened to both of my displays in different patterns, really bad luck I suppose
Currently on a 27" 144hz WQHD TN panel (lmao).
Not sure if I want to switch to an ultra-wide (34" 21:9) or a 32" 4k 144hz.
Ultrawide is not supported by most games, while I'm not sure if 4k over WQHD makes a big difference.
Price would be the same at around 700€.
for me it's the HP OMEN X
>1440p
>240hz
>fastest response times of any monitor to date outside of OLED
>fast response times even at 60hz for that one annoying game that runs poorly
>3 years of kino already
>soul TN panel that looks just as good as IPS
My LED TV got a malfunction so I just recently ordered this one cause I found a good deal (open box) on it for 550USD including taxes and everything. Will primarily be using it to play my PS5.
Please reassure me that I made the right decision.
just returned one from amazon that was a fricking lemon. Then again it was the A80K. Looked great when it wasn't fricking up, but then again it's literally the same panel as a C2/C1.
do curved monitors tend to frick with lines when drawing?
looking to get that fancy qd oled panel from samsung but idk if the curvature (1800R) will have a drastic effect on that
I recently got a Samsung G7 and the answer is, yes. It will 100% frick with lines. Even just browsing Ganker will make all text and the boxes around them seem weirdly slanted at times. I love it for gaming, but you definitely do NOT want to do any kind of drawing/editing/whatever that requires clean lines.
Yes. They all have unavoidable geometric distortion. Are you old enough to remember the CRT days when having a screen that was actually flat was not a given, but instead a highly desirable feature? Now people pay extra to have a monitor which distorts the image instead.
You could adjust it to some degree to minimize geometric distortion, but you definitely could not get it perfect on curved CRTs, and even flat ones had trouble achieving absolutely perfect geometry across the entire screen, which LCDs and OLEDs get by default due to the tech itself.
What do you mean by clarity? LCDs and OLEDs are sharper than CRTs, with the caveat that they have to be at native res, also due to the tech itself. CRTs always had some level of softness to them; they were never razor sharp like modern displays are.
motion clarity (i.e. no motion blur/"ghosting"/however the frick it's called).
I spent 700bux on a mag274qrf-qd and I regret it, not because of the money but because there's not that much difference compared to my 24" 1080p monitor. It's what, 30% bigger?
and your framerate tanks about 30% too lmao.
1080p 144hz adaptive sync is more than enough, don't fall for the meme
How do you feel about the 4070 being confirmed at 300w tdp for the base model (aib 350-400) and being priced around $699 (899 aib)? Oh, apparently equal to the 3080ti
Also hilariously enough I have a good PC monitor and the color and contrast are better on that 2007 tv screen I found in the trash.
Whatever is happening to modern technology, it's not improving it.
TVs have fricked response rate, fricked hertz, fricked colors, fricked sharpness for text, and so on. Literally only 40 year old "gamer dads" would even think of using a TV for their PC.
I got bad eyes and with the distance my screen is at I barely notice any difference between 1080 and 4k.
don't need strong hardware for 1080p so that's a real blessing in disguise.
also if you're used to 60hz, DO NOT TRY 120hz, you will not be able to go back, just stay ignorant and save processing power.
700 for no oled??? wtf
>oled
>2022
lol
at least get something like qd-oled or mini led
>qd oled
so worse OLED?
*burns your path*
does your oled phone display have burn in? oh wait you were projecting being poor.
not that anon but seethe oledcuck. your display technology is a meme.
not him but my father's have, but he bought it like 5 years ago
but still, I expect a monitor to last more than 5 years
>my father's
How progressive.
ahem
AHAHAHHA
I bought an OLED laptop in 2020 and it has pretty noticeable burn-in.
>watches CNN so much the logo gets burned in
whoever owned this TV deserves it
Nano-led has the infinite contrast of OLED but can get much brighter. You're not up to date, anon.
>27"
>4k
LOL
someone doesn't understand how DPI works; when you are sitting 6" away it's a huge difference
>sitting 6" from your monitor
enjoy going blind at 40
he's not using a CRT that blasts radiation in your face
will watching tv in the dark make me go blind too? will breaking a mirror give me 7yrs bad luck? how about black cats, are they bad?
Massive cope. If you're 6" away 1440p to 4k is diminishing returns.
>will watching tv in the dark make me go blind too
I mean, it does? Have you looked up the statistics on people in the Western world + Japan + Korea + China and the rates people need glasses? It's gone up like 10x in the past ~40 years. Something like 95% of Korean military age kids need glasses nowadays. Even in the West it's 50%+ because people stare at monitors all day. It's crazy.
Actually astigmatism is caused by lack of sunlight in childhood. Or just genetics.
you're moronic. it's the chemicals in the water that kill your brain
I looked at the sun for 5 minutes straight from the car (I was 6 or 7 years old) while my parents were driving. Nothing happened. I have been gaming for 6-12 hours a day since and I still have 20/20.
Asians in general have such shitty sòyfilled diets and lack vitamin D so they are much more defective
all the eye doctors I asked told me staring at a screen does not lead to myopia. it's a bullshit myth perpetuated by boomers. they also said myopia is hereditary.
t. keratoconus sufferer
fun fact: there's no way to fix it and it advances until you go blind
Should've played outside more.
I'm not gonna watch a video on your college dissertation anon. You should also know that the "eye exercises" Chinks force their kids to endure every day won't do jack shit for you.
>More research is needed.
they don't even know if blue light is bad, yet they peddle the science to sell products, let alone screens in the dark https://www.webmd.com/eye-health/blue-light-health
4k is the biggest meme ever, you will never play on that resolution cause you need a $1500 card to play 4k 60 fps on ultra settings.
You watched a bit too many benchmarks of shitty unoptimized games on settings that make 4k seem as reasonable as having motion blur off
But turning motion blur off is reasonable and something you should always do
No, some games have good motion blur implementation. The chimp Black folk who always turn it off no matter what are literally worse than NPCs.
>I want the screen to be smeared in baby oil when I move
Good motion blur implementation doesn't smear the entire screen. It's rare that a game gets it right, but it does happen.
Not the point, moron. Especially when most people have LCD motion blur as well while I don't.
People hate it because Unreal Engine 4 ruined a generation of games with poor graphics programming, at this point there should just be some kind of list with mb that actually does anything besides being a worse TAA
>Good motion blur implementation doesn't exist
fixed that for you
>disliking the image being clear
You're the ultimate NPC anon
The image shouldn't be clear when you race at 1000km/h. Your hand doesn't even look clear when you wave it lightly in front of your face, this is something a screen cannot reproduce (the blur from the LCD itself is something else entirely that should be removed)
Motion blur got a bad reputation since the PS3 era but we're moving away from that. 98% of games have bad motion blur, yes, but for that 2% that does it right you're literal Black person if you turn it off.
alright, what's your solution to this problem, then?
turn ingame motionblur on or off depending on the game. Even on a game like Stray the motionblur bothers me because its so strong which is a shame so that's gonna be off from me
However if you play a racing game it's a shame to turn it off. Some games also have just the object motion blur active, which is great.
okay, but real life doesn't have motion blur, though? if you play a racing game with 200 frames per second, don't you get natural motion blur from it, then? i doubt motion blur is necessary under such conditions. remember, a lot of people play racing games with only some 50-something frames or a little bit more. i can imagine playing the same game with 200fps would deliver that much sought-after motion blur effect?
>real life doesn't have motion blur, though
Of course it does, that's the problem, and that's what the effect tries (and often fails, admittedly) to reproduce. Never ridden in a car in your life or something? If you look at the close scenery on the sides, the human eye will just blur everything that's too fast.
>i can imagine playing the same game with 200fps would deliver that much sought-after motion blur effect?
It won't; a computer screen is completely different from real life (you might instead see LCD motion blur, which is nothing like real life and needs to be avoided with BFI), which is why ingame motion blur is needed. But too many games do it incorrectly.
A game like GT7 does it best: the motion blur is mostly if not entirely on the sides of the screen, to realistically blur the close scenery.
no games have good motion blur.
why would you play on ultra settings when high looks identical every time with much better performance
I had a 144hz monitor, changed it for a 4k one and never looked back it's fricking fantastic.
For gaming I run most stuff at 1440p but browsing and doing everything else there's so much room I barely need to use the second monitor.
Sounds more like you just needed a second monitor instead of making everything smaller on a same or similar size screen.
I had and have a second monitor, 4k is much better no comparison.
But what resolution were you coming from, and what kind of spastic adhd multitasking are you doing to need a 4k screen on top of a second screen?
>changed it for a 4k one
>For gaming I run most stuff at 1440p
why fricking bother with a 4k monitor then just get another monitor
I don’t get it, does using your computer at 4k shrink everything or something? I’m saving up for a 4k monitor myself
At 27 inch, a 4K resolution will make text absolutely tiny.
Ah shit what’s the next size up that’s still 16:9?
Every modern OS has UI/Text scaling, so you'll be fine.
It's just that you can use the fact that text on a 4k monitor can be tiny yet still readable to get more working space.
just make the text bigger?
Then you lose screen real estate.
Ok, and?
The point of high resolutions is to make the image look better; if you increase the monitor size you decrease the PPI, which makes it look worse.
>The point of high resolutions is to make the image look better
the point of 4k is to have 4 monitors for the price of 1. enlarging the text ruins that.
OP here.
Try not to be too jealous, anon.
Jealous of what?
>16gb of ram
>Also just 3200
Uh... one of these specs is not like the other...
>he fell for the 32 GB RAM meme
32 GB is the new sweet spot
32gb gives you "future proofing". that 16gb of 3200 cl16 ram is some basic-b***h cheapo shit and it stands out in a computer with consoomerist top-end hardware.
>future proof
lol
Why do you think it's in airquotes?
>amd gpu
lol you fricked up so so bad anon
6900xt is a good GPU. Explain you memester.
Anything above 60Hz is a meme. The only thing that matters is visual clarity, and so far only CRT can deliver in that department.
Anon, what the frick.
You've paid for 3200C16 RAM and you're running it in JEDEC 2133C15.
Enable XMP, you dumb frick.
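Side note, since this trips people up: DDR means double data rate, so tools that report the true memory clock show half the advertised MT/s figure ("2133" at stock really is the JEDEC default, not a broken kit). A tiny sketch (function name is mine):

```python
def ddr_transfer_rate(real_clock_mhz):
    """DDR moves data on both clock edges, so MT/s = 2 x real clock."""
    return real_clock_mhz * 2

print(ddr_transfer_rate(1600))  # 3200 -> XMP is active on a DDR4-3200 kit
print(ddr_transfer_rate(1066))  # ~2133 -> still sitting at the JEDEC default
```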
>corsair lpx
Probably a better idea for him to not do xmp.
>XMP
That shit has never, ever worked for me.
Messing with timings and rebooting and resetting UEFI settings over and over until things work, then enduring a 4-hour memtest to make sure there are no errors is an absolute pain in the ass.
And god help you if you ever decide to clean up your PC because I sure as hell can guarantee it won't boot up once you put it together and you'll have to go through the same shit again.
You either bought shit ram or a shit board then, xmp is one of the easiest things to not frick up.
not that anon but enabling XMP makes my board shit the bed too and makes the PC fail to boot.
RMA your ram dude. Got one blue screen after getting my 32gb kit and had it replaced, not a single issue since.
Generally, it can be solved by upping your RAM voltage by 0.05-0.1V over what its XMP is "rated" for.
Anything up to 1.5V is long-term safe.
I had to overvolt the CPU's memory controller to get it to run. I don't like doing that shit. And once I took out and reseated the one RAM stick, I had to set it up all over again.
>bought shit ram
a 16GB XPG Hunter DDR4 3200MHz u-DIMM
>a shit board then
an ASUS TUF B450-PLUS GAMING
>xmp is one of the easiest things to not frick up.
Never, EVER select XMP and call it a day. It can fry your memory controller. You need to carefully check that everything looks sane.
God forsaken computers that you morons assembled. I have no idea how XMP can make a PC fail to boot. I've literally never encountered a problem with XMP on dozens if not hundreds of machines
t. worked at a computer store
Some z690 boards have a bug where they refuse to make RAM work over certain speed/with xmp
When I upgraded my old ass build and got myself a 12700k, the first mobo I got (MSI) wouldn't let me post with anything over 3000mhz of RAM, tried 2 different kits (a 3600 corsair and a 3200 Team). Furthermore, even when I changed mobo to a ROG Strix the 3600 kit refused to post as well
That was honestly one hell of a week, since I had to spend every single second I wasn't at work trying to figure what the frick was wrong. Had to settle with less than what I wanted, RAM wise, but at least now it works
Pic related
>Some z690 boards
Goes to show that money doesn't buy quality.
>money doesn't buy quality.
Of course not. Just look at how WD burned its brand to the ground after the stunt they pulled with their NAS drives (they surreptitiously swapped them from CMR to SMR, i.e. writing speeds below 10MB/s).
Or just look at how SSD manufacturers ship cheaper, shittier revisions of their SSDs without changing the SKU.
dont worry anon. we have some good drives coming in 2023 and 2024
>Some z690 boards have a bug where they refuse to make RAM work over certain speed/with xmp
wow didn't know this was a thing. that's exactly the series of board i got. do they have any other weird quirks? i've been blaming my 12th gen cpu's e-cores for some *slightly* older games running like shit (total war warhammer 2 for example tops out at like 45 fps on a system that should eat it alive).
my board:
https://www.asus.com/us/Motherboards-Components/Motherboards/TUF-Gaming/TUF-GAMING-Z690-PLUS-WIFI-D4/
Afaik that's the only common issue. On my end I had a different one where every second boot would fail, but that was an issue with the motherboard itself. As for older games, I'm yet to see any issue that's not related to my GPU and my autism trying to see what games I can push to 4k 60fps (or the game itself having its own issues)
>Clock 1600 MHz
>amd gpu
jealousy? try pity lmao.
6900XT is RTX3090 performance for $500 less in rasterization homosexual.
OP here. I paid $800 for it at MicroCenter.
>weaker than a 3090
>weaker than a 3080ti
>trades blows with a 3080, but gets absolutely dumpstered if memetracing is enabled
>"3090 performance"
Bro the 3080 is cheaper AND stronger. Where do you get your Copium? I'm gonna need some soon for some upcoming vidya.
Definitely not at 4K.
seven hundred (700) dollary doos for a 27" screen when you could go buy a 6 foot tv for that much
why do you idiots keep buying gamer speed ram and then not even clocking it right
does the TV have the same responsiveness as the monitor? If you're paying the same price for a much slower TV, well that's not so good for gaming
put your memory in XMP1 and get a 1440p monitor with a high refresh rate like 165hz or 240hz
4k is gonna bog down your system and it's not worth the performance loss
holy moron Black person
>AMD GPU on Windows
>16GB of RAM
>a shitty 5900X instead of 5800X3D
>3200MHz RAM with a ryzen
There is so much wrong with this setup that it's just hard to know where to even begin to fix it.
Probably installing linux so you get the actual performance of that GPU would be a start
>AMD
sell that shit
Can someone on this thread explain why AMD GPUs are supposedly shit? Or is it just the >muh OpenGL meme?
Unstable
Weird quirks with some software / games
Weird quirks if not using stock settings in the driver
Only recently caught up to NVIDIA with their AMF encoder. Finally caught up to 2012...
Better than israelitevidya's ass-ravaging telemetry, plus it works like a charm on Lunix thanks to AMDGPU.
>he doesn't use NVCleanInstall
>he uses Linux
I mean, lol.
>he thinks he can scrub telemetry using random scripts he downloads off the internet
>winbabby
opinion discarded.
>he thinks he's safe from telemetry
oh no no no
>amd gpu
LMAOING AT YOUR LIFE RIGHT NOW MY DUDE HOLY SHIT
>6900XT
>being poor
This. Nothing worth playing will play well at good settings at 4K and 60+ fps. Something like 1440p is a nice boost to your resolution (if your monitor is big enough), and you can enjoy high frame rates between 60-120fps, but you need the graphics card to support it. Even on my RTX 3070, I can get around 80-90 fps at 1440p. FPS would plummet at higher resolutions and I'd barely be able to see the difference on a desk monitor.
>(if your monitor is big enough)
you want to keep the monitor's size low enough to keep the pixel density high; increasing both size and resolution effectively does nothing or very little for image quality, it will look the same just bigger, and you're pushing your hardware much harder for nothing
think about how crisp your phone looks compared to a big monitor
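Viewing distance is the variable this comparison hinges on: what the eye resolves is pixels per degree of visual angle, not raw PPI, which is why a phone held close and a TV across the room can look equally crisp. A rough sketch using standard visual-angle geometry (function name and distances are mine, for illustration):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Pixels packed into one degree of visual angle at a given viewing distance."""
    # width of a 1-degree slice of the screen at that distance, in inches, times density
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

# a ~92 ppi 32" 1440p panel: apparent crispness depends on how far you sit
print(round(pixels_per_degree(92, 24)))  # at desk distance (~2 ft)
print(round(pixels_per_degree(92, 96)))  # at couch distance (~8 ft)
```

Same panel, four times the distance, four times the apparent density.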
my 2070s does it just fine. at worst i bump it down to 1440p, which is still very crisp looking.
>ultra settings
an even bigger meme than 4k
>ultra settings
Oh yes, whatever will I do with Nvidia's experimental HairWorks "technology" eating up 90% of the gpu
Or just buy an Xbox SeX. Holy shit you're delusional.
lol 4k is a meme
4k 32in is God tier for working though
2k 144 32 is the best value atm
yeah, but in five years 4k 144 32 is going to be the best value.
1440p is poorgay cope. 4k makes all the difference. Maybe get your eyes checked, anon.
>1440p
dude lmao imagine needing over 1080p
1080p is ugly blurry shit that hinders immersion
At 27 inch 4K barely makes a difference.
>4k is the biggest meme ever
For vidya? Yes. For productivity? Absolutely not.
split it into 4 1920x1080 screens, achieve multi-monitor glory.
For vidya if you're not moronic you play in full screen and use integer scaling.
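Integer scaling only stays pixel-perfect when the native resolution is an exact whole-number multiple of the render resolution on both axes; a quick check (helper name is mine):

```python
def integer_scales(native, target):
    """True if target maps onto native with the same whole-number scale on both axes."""
    (nw, nh), (tw, th) = native, target
    return nw % tw == 0 and nh % th == 0 and nw // tw == nh // th

print(integer_scales((3840, 2160), (1920, 1080)))  # True: clean 2x2, the "4 monitors" case
print(integer_scales((3840, 2160), (2560, 1440)))  # False: 1.5x, needs blurry interpolation
print(integer_scales((2560, 1440), (1920, 1080)))  # False: no clean ratio at all
```

This is why 1080p content looks clean on a 4K panel but 1440p content does not.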
>all talking about resolution instead of pixel density
morons
1080p might be okay on 19-24 inch monitors, but anything higher than that and you need to step up resolution
if your computer has not been upgraded in at least 4 years, you aren't going to get 144+fps solid on every game above 1440p, so what is the point
I can't even imagine using bigger than 21 inch monitor
you don't need it, a good $100-$150 monitor is enough, 4k is a fricking meme on pc
27"
4k
The hdr is shit
Things won't make you happy anon. I learned this the hard way. No matter how shiny the new toy is, it will always leave you unfulfilled and joyless sooner or later, usually sooner. Use the $700 on some form of self improvement. Don't fill your void with material pleasures like I did.
So what did you waste your money on? Nvidia cards or TFT?
get the 850
4k is a meme also no oled.
165hz feels a lot better than 144. i use 240hz but i struggle to see a difference between 163-165fps and 237-240fps. remember to cap 3 fps under to make g-sync work properly
i would also make sure to buy a monitor with the lowest possible input lag, just google comparisons. same goes for response time and error rate
Panasonic
>IPS
Here's why
this is a meme, I have used IPS monitors for a while and never noticed the glow in dark scenes
>this is a meme
Just because you have not experienced it does not mean it's a meme, and I highly doubt your IPS monitor has no form of backlight bleed. Backlight bleed on IPS monitors is very real, maybe not to that extent, but an overwhelming majority of IPS panels suffer from it.
use your camera phone and increase ISO
It's probably a decent enough monitor, though I've never seen that particular model in person. 27" at 4k is close to perfect for use at a desk I think, text looks very good and sharp and so do games. It's especially nice in games since the high DPI makes aliasing much less noticeable and the image is very detailed. I've been using 4k at 27" for about 7 years and there's no going back to low DPI once you've seen it.
Nah it happens, I assume some people are just completely moronic and use the monitor at max brightness in a dark room. Some screens will look similar to that.
it's really not that big of a deal, glow eventually evens out with time and use
It does not
I too love staring at a completely black screen at night with all lights turned off
why do you morons always post some worst-case example to prove your point? homie you literally posted the cheapest chinkshit ips monitor ever.
This shit is a meme, not once in six years of using an IPS monitor has it ever affected me during actual usage no matter what I'm doing no matter how dark a game is. Meanwhile other panels I've used with the shitty viewing angles directly affected me every fricking day, if I leaned back slightly in my chair the monitor was unusable. THAT matters, this cherrypicked shit doesn't.
I literally bought 3 IPS monitors and they all had insane glow that was immediately noticeable the moment the screen wasn't blasting bright colours, playing fricking terraria was hell.
I will no longer buy garbage outdated technology, they need to move on to oled already.
You know technology evolves.
https://arstechnica.com/gadgets/2022/03/explaining-ips-black-the-display-tech-in-dells-new-ultrasharp-4k-monitors/
New Dell technology can hit 2000:1 static contrast with IPS.
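Static contrast is just peak white luminance divided by black luminance, which is what makes that 2000:1 figure meaningful; the nits below are typical ballpark values for each panel type, not measurements of any specific monitor:

```python
def contrast_ratio(white_nits, black_nits):
    """Static contrast: full-white luminance over full-black luminance."""
    return white_nits / black_nits

print(round(contrast_ratio(350, 0.35)))   # ~1000:1, typical IPS
print(round(contrast_ratio(350, 0.175)))  # ~2000:1, the IPS Black claim (halved black level)
print(round(contrast_ratio(350, 0.1)))    # ~3500:1, decent VA territory
```

Doubling the contrast means halving the black level at the same brightness, which is the entire IPS Black pitch.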
Monitor tech doesn't really evolve, it just eats the scraps of TV tech 10 years later.
I'm sure this """new""" IPS tech still has glow and will still use shitty matte coatings.
I've owned two different Asus ips panels and glow was next to nothing.
Can I run two 1080p side monitors with a 4k central monitor?
Yeah
165hz isn't a multiple of 60fps so I just can't recommend it. Go either 120hz, 180hz, or 240hz; anything in between is kind of a meme
>No good backlight strobing
>4K on windows is hell
>4K.. on a 27" size is utter garbage tier
>not actually 1ms, because there's no BFI
>IPS
>>4K on windows is hell
Not really, you can either go for 150% scaling (which does manage to break some games by making them think your panel is a 2k one) or 100% scaling, which makes text a bit too small until you get used to it (what I'm using and what I prefer)
>>4K.. on a 27" size is utter garbage tier
You have no idea what you're talking about
t. 4k 27" monitor user
>144hz
I wouldn't use it if you paid me
What's your current net worth OP
guys, i'm in the market for a new 4k monitor, what is the literal best one? i'm not rich but i'm imagining that i am, so lay it on me, buds, give me your absolute worst of the worst (the best).
>new 4k monitor, what is the literal best one
LG C1.
the c1? isn't the c2 out now, though? the internet seems to agree that the second version is better..?
No? It’s not? It’s just a bit brighter? Other than that they’re exactly the same? Except the c1 is about $500 cheaper?
I have two words for you
WQXGA monitor
do these even exist in good quality IPS with freesync and high refresh rates?
Aw2721d
Anybody have an idea why Nier Replicant looks awful with HDR on? Even at 4k res there are these weird black blotches in darker spots, while in SDR it looks just normal
Lot of PC games just have broken HDR.
Man that sucks, ones that do work look beautiful
I paid 500 for a 34" with basically those same features about a year and a half ago wtf. Thanks Biden.
Probably doesn't have hdr, which is a moron tax.
The moron:
>4K RTX high ultra settings on cyberpunk and ONLY 45fps? Better stick to 1080p, god 4k is a meme...
The knower
>4K RTX high ultra settings on cyberpunk and 45fps? Cool, that means I should get 200fps on good games
144hz is shit. 240hz is the new thing. go 1440p 240hz.
But with demanding games even the top tier GPUs of today aren't achieving 200 frames anyway
the best shit is. doom eternal will give you 240hz. then you have all the good multiplayer FPS games. then you have all the older shit that's actually good (like classic Doom, quake, etc). thats all u need
nta but the point of 240hz screens is their set-and-forget nature (they'll have correct framepacing no matter the content, even outside of gaming, something not true of 144hz monitors for example, which can't display 60fps youtube videos correctly) and they have response times good enough for most lower framerates as well as full 240fps
Just because you buy 240hz doesn't mean you NEED to get every games to run that fast.
But also just because you get a fast monitor doesn't mean 60FPS will stop looking blurry. You need backlight strobing to make 60fps games look clear again
I hate the C2 because it has no 120hz backlight strobing anymore and slightly worse motion resolution at 60hz strobing. It's also not as cheap as the C1 was a few months ago, and unlike the C1 it doesn't have the higher durability panel.
If you have no choice and don't give a shit about the black frame insertion mode at 120hz (a shame, because 120FPS + BFI looks fricking incredible on my C1, absolutely no motion blur, way better than CRT) I guess the C2 is ok. But I'd wait for WBE panel versions of the C2 at least (it's a lottery)
>for example that can't display 60fps youtube videos correctly)
Anon, there's no v-sync for youtube videos. Having a monitor that works in multiples of 60 (or 30, or 24, or whatever) doesn't mean you won't get screen tearing. You will. In fact, if you're working with a true multiple of the content's framerate, you're more likely to notice it as it's more likely to appear in the same spot.
Another thing is that past ~100 Hz screen tearing becomes so minuscule you won't notice it anyway.
?
This isn't about tearing or vsync, I'm talking about motion judder
A 144hz display won't show the proper framepacing of 60fps content and it's terrible to use day to day. I have to drop my monitor down to 120hz sometimes just because of that. This is true for any refresh rate that's not a multiple of whatever content you're watching. Try it out with this if you don't believe me. 60fps doesn't look smooth on a 144hz panel, but dropping it to 120hz will fix it. Meanwhile 72FPS looks just fine on a 144hz monitor while looking terrible on a 120hz monitor
https://www.testufo.com/framerates-versus#photo=dota2-bg.jpg&pps=1920&framepacingerror=0.1&direction=rtl&framerate=60&compare=2&showfps=1
Freesync does fix this, but as far as I know VRR won't activate on your webpage or video app.
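The frame pacing complaint is easy to check with a quick sketch (not from the thread, purely illustrative python): on a fixed-refresh display each source frame has to be held for a whole number of refresh cycles, so a non-multiple refresh rate forces an uneven cadence.

```python
# Hypothetical sketch: how many refresh cycles each source frame gets held for,
# assuming each new frame flips on the nearest refresh tick.
def cadence(refresh_hz, content_fps, frames=6):
    r = refresh_hz / content_fps
    return [round((i + 1) * r) - round(i * r) for i in range(frames)]

print(cadence(144, 60))  # [2, 3, 2, 3, 2, 2] -> uneven hold times = judder
print(cadence(120, 60))  # [2, 2, 2, 2, 2, 2] -> every frame held exactly 2 cycles
print(cadence(144, 72))  # [2, 2, 2, 2, 2, 2] -> 72fps is fine on 144hz
```

The 2-3-2-3 cadence at 144hz is exactly the stutter the testufo link demonstrates; any refresh rate that's an integer multiple of the content framerate gives a flat list.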
Ah, you mean that.
Well then maybe.
Sounds like a superficial issue your eyes will adapt to in about two seconds.
I probably could notice it in a side-by-side comparison but I haven't ever had any issues watching 30 and 60 FPS content on a 144Hz screen.
>https://www.testufo.com/framerates-versus#photo=dota2-bg.jpg&pps=1920&framepacingerror=0.1&direction=rtl&framerate=60&compare=2&showfps=1
By the way, checking this comparison made me realize how terrible the response times on my VA monitor are.
I'd notice black smearing all the time and I'd be
>Yeah, whatever, it's just a specifically bad case
But looking at the video in both 144 and 120 Hz I was like
>Hell no, the """60 FPS""" part is not true 60, I can literally see it jittering back and forth at roughly the same speed as the bottom part
So in an attempt to be fair and scientific about it, I whip out my phone and take a 120 FPS video of my screen.
Then I go through it frame by frame and realize that the top part does indeed update pretty much exactly every two frames, but the response time is so fricking shitty, there's like 30% worth of an afterimage sitting on the screen until the next frame finally makes it go away, creating an illusion of 120 Hz """motion""" on a 60 FPS video.
Maybe it is time to upgrade after all.
oh. ok.
>absolutely no motion blur,
That's complete BS. It's significantly reduced but it's still visible. We need like 480hz + BFI to get rid of it completely.
>way better than CRT
You're blind. 120hz with BFI still has motion blur. You can see it clearly with the test patterns in this. BFI is a massive improvement but the motion blur is still there.
https://artemiourbina.itch.io/240p-test-suite
And you can also use the test patterns on blurbusters - the motion blur is still easily visible with 120hz BFI.
You need a CRT or Plasma to get rid of it completely coz 120hz isn't anywhere near fast enough.
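For what it's worth, the blur both anons are arguing about follows a simple rule of thumb (the Blur Busters persistence formula; the numbers below are just illustrative): perceived blur is roughly your eye-tracking speed times how long each frame stays lit.

```python
# Sketch of the persistence rule of thumb:
# perceived blur (px) ~= tracking speed (px/s) * time each frame is lit (s).
def blur_px(speed_px_s, persistence_ms):
    return speed_px_s * persistence_ms / 1000.0

speed = 960  # a common testufo panning speed, in px per second
print(blur_px(speed, 1000 / 120))      # 120hz sample-and-hold: ~8 px of blur
print(blur_px(speed, 1000 / 120 / 2))  # 120hz + 50% BFI: ~4 px, better but visible
print(blur_px(speed, 1.0))             # ~1ms CRT/plasma-style flash: under 1 px
```

Which is why BFI halves the blur rather than eliminating it, and why sub-1ms persistence (CRT phosphor, aggressive strobing) is what actually makes it vanish.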
Isn't 27" too small for 4k?
Just get the 1440p one
I'd get something with proper HDR even if it means waiting.
HDR on Windows is shit not because of displays, but because of Windows
No, you have no idea what you're talking about.
HDR on Windows works perfectly fine - you just don't have a monitor that can actually do HDR.
While my monitor is definitely not the LG C1, that doesn't mean it's shit. When HDR works as it should it looks great. On games where it doesn't it looks like absolute ass
>that doesn't mean it's shit
9 times out of 10 it does.
If you aren't using an OLED, mini-LED, or at least something FALD with a shitload of individual lighting zones, you literally cannot see HDR content.
Your monitor can read HDR signal (hence why it's HDR "certified") but it literally lacks the hardware to display it.
Windows can't magically make up for a lack of proper hardware.
>27"
>4K
>144Hz
If you're gonna buy a 4K panel go for a bigger good one at 60Hz that's not marketed towards gaymerz, like the Dell ones
It isn't a vr immersive experience so it's a dud. The future of all displays will be full FoV on a face-mounted screen.
Seeing how many games don't even have VR support, no you're wrong. VR was advertised initially as a peripheral that you could slap and play STALKER or whatever atmospheric game, but turns out it is a platform that needs games to be specifically made for it, and people are fiddling with modding to be able to play traditional games on it. Clearly this is not the future of playing games but a separate platform entirely (which is a much less interesting proposal since it needs exclusive games to flourish)
I'm not talking about just gaming, the internet and pc use in general will be a full 3d experience in vr and old style monitors will be dead.
Ok call me when that future arrives because it looks to be nowhere in sight
Zucc is working on it and the monitor powers that be are trying to prevent the awakening.
the news isn't good if you want a metaverse reality, people don't like being glued to a VR headset for hours to work on shit
They'll get lighter as they develop more and it's way cooler than gay old rectangle screens that make you lose your immersion when you see the wall behind them.
Okay, here's the truth and the only answer that matters: once you're used to high fps gaming (144 and above), it will be extremely hard to go back. 60 frames per second will feel very choppy and not fluid at all. I promise you, you won't ever go back again. No video card out there can play the graphically demanding triple A video games of today at those high frame rates at 4K (again 144 and above). If you're sure you barely ever play these types of graphically demanding video games and you prefer your bit more low key indie games, then go ahead and buy the 4k monitor if you've got a good PC and money is no object. Otherwise, lower the resolution so you can come close to that high frame rate.
>it will be extremely hard to go back
takes like 30 minutes to re-adjust
t. someone whose 144hz monitor shat the bed so I had to go back to a 60hz monitor.
really not bad at all
Get it. I have the same monitor and if you have a 3080 like me you'll play games in 4k 120fps. 1440p is FRICKING dogshit in comparison. Poorgays will cope and say otherwise.
>3080 like me you'll play games in 4k 120fps
Complete fricking bullshit, your dogshit 3080 gets raped at 4K. Hell, fricking 3090s and 6900s get fricking raped. 80 fps MAX on the most demanding triple A games at 4K. Expect 60 fps on average
Stop spreading misinformation. I easily get 4k 120 fps on epic settings in Fortnite. Halo Infinite also runs native 4k all settings maxxed and high frames. Maybe consider a non dogshit amd cpu that bottlenecks your pc? My 12th gen i7 does wonders paired with my 12 gig RTX 3080.
HAHAHAHAHAHAHAHAHAHAHHAHAHAHAHA. Nice 120 fps bro. You barely get fricking 60 you dumb frick. Stop lying about your dogshit rig.
>not even a single good game benchmarked
Why should I care?
>mentions halo and fortnite
>video has both
Feels good man. 4k 120fps in Halo Infinite with my 12th gen i7 and 12 gig RTX 3080
one monitor
>IPS
>height not adjustable
I have this monitor. You can adjust the height. It's the best 27 inch 4k monitor you can buy.
You’ll never play a modern game at 4K 144Hz not at least for the next decade.
Ohhh nyoooooo not le heckin farcry 12 and assassins creed 17!!! I'll have to play good games instead I guess....
i'm doing it in another window right this second.
Never spend more than 300$ on a monitor unless you use it for design work
>27 inch
you should shoot for 32 inch @ 4K minimum at that price bracket.
https://www.ebay.com/itm/304490421697
>spending more on GPUs than the output for that GPU
moron. Not everyone lives by your broke ass spending habits. I take home $6,000 a month and spend almost every waking moment in front of my computer. Why shouldn't I spend $500-$700 on a nice display?
>Why shouldn't I spend $500-$700 on a nice display?
Because it's never as nice as the price tag implies and you're getting ripped off. If you really want to burn money so bad there are better ways.
I'm not burning money though. I have an ultrawide 1600p monitor and it's glorious and I paid $740 for it 4 years ago.
It's pretty difficult to craft an argument that the display I stare at for 16 hours a day isn't "worth it". Especially when I can write it off as a work expense or have my workplace pay for it themselves.
>IPS
just pay up $300 more to get a 55 inch LG C1 oled
wait for the future.
80 ppi...
>all those moronic effects
imagine the input lag.
Where is the microLED technology that was hyped up to fix everything that was a problem with OLED 10 years ago such as burn in
Not happening anytime soon.
What do you need it for? So you can play no games and browse Ganker in 144hz?
>LG
Get an Asus or a Dell, moron
tfw just got my sony inzone m9 today but it has a dead pixel. part of me wants to return it, but the other part of me realizes it's not noticeable and you can only really see it on flat light colors, so 90% of the time I won't even see it. it will still live in the back of my mind, but man, I don't want to go through the hassle of returning it
return it, moron, before it gets more dead pixels
LG ultragear screens are fricking awesome. But honestly I like 4K for TV's (to hook your pc to) more and ultrawide 3440/1440 more for computer monitors.
You don't have a computer that can run games on 4K@144Hz. Not even close.
i'll check back in five-ten years from now and see if the monitors are better then.
stop buying monitors, they're ripping off "gaymers" on purpose. i bought a 65" tv with 4k 120hz for only $600 (hisense)
>$700 for an IPS panel................
based moron
If it doesn't have BFI you're wasting your fricking money.
I don't like LG monitors. Prefer MSI. Both better than ASUS though.
Ignorant
I'd buy Samsung Neo G7 were I to have literally any faith in Samsung's utter shit of a QA.
Looks like a perfect monitor at an acceptable price point for me, could drop the curve but I'd probably get used to it.
Sony - 27” INZONE M9 4K HDR 144Hz Gaming Monitor with Full Array Local Dimming and NVIDIA G-SYNC - White
>Sony’s 27-inch InZone M9 gaming monitor is practically a high-end 4K Bravia TV with a few features (and more than a few inches) chopped off. It can’t serve your channels or streaming apps, but this monitor can deliver full array local dimming for more accurate backlighting, which is something few TVs and even fewer monitors can do. It’s a great monitor if you watch a lot of HDR content on your computer — the matte panel looks bright, detailed, and gorgeous. It’s like watching a really good TV that’s not actually a TV.
>But the main reason to spend $899.99 on the M9 is to play games, and it ticks a long list of boxes that many PC and PS5 gamers have been waiting a long time to see. In addition to stocking a great 4K (3840 x 2160) panel, it has two HDMI 2.1 ports, Nvidia G-Sync compatibility for PC, 144Hz variable refresh rate (VRR), auto low-latency mode (ALLM), and other cool PS5-specific HDR features that display nerds will appreciate. Beyond stuff angled toward pleasing the most particular gamers, it has a slick, PS5-inspired design, USB-C video support, and USB-A passthrough for connecting accessories.
https://www.bestbuy.com/site/sony-27-inzone-m9-4k-hdr-144hz-gaming-monitor-with-full-array-local-dimming-and-nvidia-g-sync-white/6512813.p?skuId=6512813
Sony started selling their own gaming monitors and is forcing everyone at EVO right now to use their gaming monitors and gaming headsets under the INZONE branding so all the EVO live streams you are watching have the players using Sony products
Why would you buy this over the dell g3223q?
150 bucks just for fald?
AU Optronics is planning on releasing, still this year, panels with over 500 zones of local dimming. This monitor is already obsolete at this ridiculous price.
no point in getting 4k for 27" stick with 1440p
dont buy it dont be a consoomer
spend it in your everyday life instead
a steak and beers
going out
i dont know but games are overrated
Never pay more than $200 for a monitor.
700 for a tiny screen?
4k doesn't even matter on a screen that small. You can't see the difference from 1080p.
For $900 you can buy the Inzone M9 which measures beautifully and has 96 dimming zones for a very decent HDR experience.
Those LG panels with Nano-IPS have dog shit contrast. Like seriously dogshit. If you don't want to spend $900, at least get the Samsung G7 (IPS) or the Gigabyte M28U.
You're poor.
Get a cheap second hand plasma. 60hz on a plasma looks smoother than even 120hz + BFI on an LCD/OLED because plasma's have absolutely zero motion blur.
>4k for a 27 inch monitor
What a waste, just get a 1440p monitor and call it a day, resolution is a meme anyway, it's the refresh rate you want
add some more get a lg c1 instead and use as monitor.
>4k
Save yourself a few hundred and get the MSI MAG274QRX or Gigabyte M27Q-X
>1440p
What is this, 2016? Do you also play your consoles on a plasma television?
Do you play with your monitor 20ft away from you? No? Then you're just being a fricking idiot.
If you work on this, go ahead. If you only play single player games it will be nice. If you play multiplayer and still have to ask, you might as well buy it because you probably won't notice, casual.
>b-but consoles
Imagine buying semi decent pc parts to emulate the pleb experience, is this bait?
4k is not a meme guys. The more you try to force this the more your poverty shows. Consoles are running 1440 minimum now.
In the current year, yes, it's a meme and not worth it at fricking all.
>2005
>1680x1050 monitor
>2022
>XD262gay26BBC 69`` Ultramega ADHD (474626x1367271) 666 GHz refresh 0,00001 mms, butt sync graphics, ass infused reality substitute monitor 262462%%%? - Black Led lid monitor + free dlc
>2005
>1680x1050 monitor
and they have the nerve to still consider 1080p high definition, let alone 720p
homie, just get an LG C1, it works great as a monitor for gaming, but you still have to use a remote to control it since it's a TV.
https://www.amazon.com/dp/B08WFK81RH/ref=twister_B09GQJ1RR7?_encoding=UTF8&psc=1
> 48"
No.
There are other sizes too.
Shrink it down to 32 and I'll consider.
Just get the C2.
It's 42"
https://www.amazon.com/dp/B09RMFZZPX/ref=twister_B0B3S7JVND?_encoding=UTF8&psc=1
Still too big for my desk.
Either way, OLEDs are notoriously bad for doing actual work, e.g. Excel, Word. Their burn-in countermeasures, such as reducing brightness, are highly annoying but a damn necessity if the panel is to last more than 1 year without damaging itself.
Well yea it's a gaming TV. If you need the monitor/TV for work related things then I can see why you wouldn't get the C2.
Besides, if you ever do decide to get it you can just buy a cheap wall mount to fit the TV for viewing.
>If you need the monitor/TV
I want a monitor for everything. Single.
Fair enough, just be aware that it will take years before you see HDR OLED 4k monitors, and I guarantee those will be expensive as hell at release. Probably more expensive than TVs.
get the Dell QD OLED
700 / 12 = ~60 bucks
Have 60 bucks a month to splurge? Go for it.
>LG
>not OLED
>4k at only 27"
>$700
for that price bracket, and since you don't seem to care about OLED or any sort of dark color accuracy, you should get something closer to an AORUS FV34U so you can watch movies from far away.
If I'm dropping that much on a monitor I'm just going to get an OLED TV that can do 4k/120 hz and find a way to make it work.
Not enough LEDs
what game do you need to play at 4k and get 144Hz? is your gpu even capable?
27" is too small for 4k
get a 24" 1080 240Hz and a 32" 4k 75Hz instead
Have yall been living under a rock? Starfield & Ragnarok are gonna be out soon.
Oled on a gaming monitor is just begging for burn in.
Wait for monitors with DP 2.0 to come out.
monitor is 10-20": 1080p
monitor is 20-27": 1440p
monitor is > 30": 4k
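That size/resolution list maps directly to pixel density; a quick sketch (the ppi figures are just geometry, the pairings are the ones from the thread):

```python
import math

# Pixel density for common size/resolution pairings (diagonal in inches).
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(1920, 1080, 24), (2560, 1440, 27),
                (3840, 2160, 32), (3840, 2160, 27), (3840, 2160, 55)]:
    print(f'{w}x{h} @ {d}": {ppi(w, h, d):.0f} ppi')
```

4k at 27" lands around 163 ppi, which is why some anons call it overkill, while 4k stretched to 55" drops to the ~80 ppi complained about earlier in the thread.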
To make a somewhat ridiculous analogy, the monitor is to the PC more or less as the tires are to the car.
It's the monitor that makes the connection between the hardware and the user, so economizing on this part makes no sense. When I upgraded my PC I decided to keep using a 1080p/60hz monitor because I wanted to run everything on Ultra, which I could. But when I finally moved to the next stage 1440p/144hz/g-sync I noticed how much I was missing out.
In my opinion, the higher the resolution and frequency the better, just take into account your pocket and whether your hardware will be able to take advantage of both the high resolution and the high frequency.
Note: IPS is really bad for dark games, be careful with this, Quake is almost unplayable on my monitor.
If you're in the monitor market for gaming, I recommend 1440p 240hz, either samsung VA or some other brand IPS.
240hz is nice for gaming, 1440p can be done with this gen GPUs
4k IPS 144hz is kinda hard to recommend. The PPI is too high to actually get anything out of in most games, you're unlikely to be able to run it full hz/fps without 4000 series gpus, but if you're planning on buying one then go ahead buy a future proof display. Get a 32" tho, 27" 1440p makes too much sense.
If you're spending near 1k € on a display then don't buy IPS. It sucks, it has bad response times, backlight bleed, and is unlikely to do good HDR.
LG 42" c2 is much better option, neo g7 and alienware qd oled are good too.
I just want a nice 1440p panel with HDR and FALD of at least 96 zones. I've seen the Inzone M9 and that thing is really nice for HDR - but it's 4K. There aren't any 1440p with FALD around.
Yeah, samsung started doing minileds in 2021 with tvs and now 2022 monitors, but they're curved VA 4k.
Alienware QD oled is next best thing as its less demanding than full 4k. It provides great hdr gaming experience.
>Alienware QD oled
Not interested in paying with my limb, don't want to deal with oled burn in, and lastly the subpixel layout of that monitor is really weird. Looks like a CRT shadowmask and makes text not very clear. It creates an effect similar to chromatic aberration.
https://images.frandroid.com/wp-content/uploads/2022/03/alienware-aw3423dw-defaut-pixels-1-scaled.jpg
I wouldn't be worried about burn-in. It's gonna easily last 5 years.
Despite all its issues it's the best gaming monitor.
It's not for office work, but who cares. Who here does office work?
Turn off ClearType homosexual.
the text is just a software incompatibility since microsoft is trash, you can fix it yourself with mactype
>1440p panel with HDR and FALD of at least 96 zones
I would argue that HDR is a very premium feature and it is weird not to consider going for the mere "higher-end" 4K over 1440p.
You can use a 4K monitor to consume 1440p content - you can't do the reverse. Why would you invest a considerable sum of money in something that is already getting slowly outdated and will become effectively obsolete for the asking price in two-three GPU generations?
Don't you want your monitor to still be relevant five years down the line?
Not to mention, there's more to PCs than just gaming, and even then the
>you can't run games at 4K and high refresh rates
is literally not applicable to 99% of the games in existence.
>b-but it's old
Who cares? You can run DOOM: Eternal 4K Ultra-Nightmare settings at 80-120 FPS on a 3080.
Is it "old"?
Will you literally never return to it or this generation of games in general?
Will you never play some sort of indie "muh atmosphere" game like The Pathless or something that can probably run on an actual toaster?
You WILL benefit from a 4K monitor, there are no downsides to getting one outside of the price, and as a long-term investment, the price point you are considering can't reasonably be called "budget".
>You can use a 4K monitor to consume 1440p content
Bilinear upscaling looks disgusting.
> inb4 lanczos
Still looks bad and adds ringing.
>Bilinear upscaling looks disgusting.
If you're pixel hunting static images at 700% zoom two inches away from the screen.
You will NOT notice the difference between 4K and 1440p on a 32" screen in a fast-paced 3D action (the genre that benefits the most from high refresh rates) unless you stop and deliberately pay attention.
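The scaling argument above comes down to whether the panel-to-content ratio is an integer; a small sketch (illustrative, not from the thread):

```python
# Only integer ratios map each source pixel to a clean block of panel pixels;
# anything else forces interpolation (bilinear, lanczos, ...).
def scale_factor(panel_px, content_px):
    f = panel_px / content_px
    return f, f.is_integer()

print(scale_factor(3840, 1920))  # (2.0, True)  -> 1080p integer-scales on 4k
print(scale_factor(3840, 2560))  # (1.5, False) -> 1440p on 4k needs interpolation
print(scale_factor(2560, 1920))  # non-integer  -> 1080p on 1440p is also soft
```

So a 4k panel shows 1080p content cleanly at 2x, but 1440p content on it always goes through a scaler, which is where the "disgusting" bilinear look comes from.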
If you're going for high end monitors with a high end gpu then there's the lg 42 c2, neo g7, alienware qd oled
Why burden yourself with a 4k ips monitor after all.
>then there the lg 42 c2, neo g7, alienware qd oled
These cannot be your main desktop monitor. If you can afford (quite often, not even monetarily but in terms of available room/desk space) a secondary monitor dedicated to content consumption exclusively, then sure, going OLED is the best option.
But as a universal monitor option OLED is borderline unusable garbage. It's too dim, it's not good at displaying text, and it WILL burn in rapidly, no matter how much you'll try to prevent that - and dancing around your monitor burn-in problems is not a premium experience.
Neo g7 certainly can be a universally good mixed-use monitor.
LG oled and qd oled are obviously superior for content consumption.
Yeah, I'm considering Neo G7 myself but it's Samsung and while their tech is great, their QA is fricking abysmal.
Yeah well what other monitor manufacturer are you going to rely on?
Acer, msi, gigabyte? Their qa is good thanks to using cheap chink IPS panels since the tech is 20y old.
so I'm still using a 1080p 24in 60hz LG monitor
any recommendations for a 144hz monitor with a higher resolution? preferably less than 200 bucks since I'm stingy with money
>so I'm still using a 1080p 24in 60hz LG monitor
Hi, me.
This is the cheapest, best 1440p 165Hz monitor money can buy right now.
i skipped over all non 24-multiple monitors since there will be screen tearing when watching stuff like anime and movies
or am i wrong?
Yes there will be judder because 165Hz is not a multiple of 24. You can, however, easily drop the frequency to 144Hz. These 165Hz are, more often than not, actually overclocked. You're fine dropping the frequency.
>you can drop the frequency
thanks I didn't know that. guessing it's going to be a BIOS setting? I'll see once I get my hands on the monitor
>guessing it's going to be a BIOS setting
What? We're talking monitors here, not PCs. Just go to the display settings and change the frequency.
https://www.howtogeek.com/wp-content/uploads/2018/07/img_5b5113bd9d968.png?trim=1,1&bg-color=000&pad=1,1
Another alternative is leaving the monitor at 165Hz and using temporal interpolation. No, this is not the godawful vector interpolation TVs do. This adds nothing new but instead blends two frames so the transitions are smoother. The player "mpv" can do this, as do MPC and VLC I think. Still, dropping the frequency to 144Hz is trivial and doesn't cause any issues.
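For reference, the mpv side of that suggestion is just a few lines of config (option names per mpv's manual; a sketch, not a complete setup):

```
# ~/.config/mpv/mpv.conf
video-sync=display-resample  # lock video timing to the display's refresh rate
interpolation=yes            # enable temporal interpolation
tscale=oversample            # judder-reducing timing filter; tscale=linear
                             # blends adjacent frames instead
```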
stop being me, doubles
t. 2x 1080p 24in 60hz LG
you can get a 1440@144 with similar high-quality specs for just over half as much
Take the LG C2 pill and get that 42inch model.
>IPS
>matte blur coating
enjoy your literal 0 contrast monitor, you won't be able to see anything behind all the glow, bleed, and diffusion.
If you're gaming then it's moronic to have a screen bigger than 24" because of the eye travel.
it depends how deep your desk is, 24inches feels a bit too small from where i'm sitting so 27inches is perfect
Eye travel doesn't matter unless you're playing some pajeet game like league of legends or CSGO in which case just move the monitor 2 inches back lmao and get the same effect
Go for it. I have the 850-B which looks identical in terms of stats except 1440p and it's been amazing.
Although I would say for a screen this size 4K seems overkill. That's not sour grapes, it really is a suitable screen size for 1440p. Consider a larger monitor if you're dead set on 4K.
>$700
>Not GIGABYTE M32U 32
Who doesn't want display stream compression up the ass after all?
That's why you should go with the Samsung G7.
Dont fricking buy a 27" 4k display, come on guys.
Just buy OLED bruh.
for use in a normal room you're probably not going to turn the brightness of any monitor above 80 nits
Name one monitor with the following specs. If you cannot, then all existing monitors don't even pass the basics of 4K gaming:
1. 120+ Hz refresh rate
2. Gsync/Free Sync
3. QD-OLED/OLED
4. BFI
5. 1000 nits peak brightness
6. 16:9 aspect ratio
7. Not Curved
8. 32 - 40in Monitor size
>5. 1000 nits peak brightness
Actual meme.
Sony inzone
Just a regular, shitty IPS by AU Optronics.
Just buy a neo g7 and stop being so angry.
It's a clear improvement over the chink IPS panel era.
> curved
> black smearing
> awful samsung qa
Not very convinced.
Curved is a shape
Black smearing is not a trend with samsung va's the way it is for cheap va displays
QA is what it is, buy one and hope it doesn't need to be returned.
>black smearing
You literally have no idea what you're talking about.
Yes I do.
No, you don't.
Its "black smearing" is less pronounced than just "smearing" of a typical 32" 4K IPS panel.
OLED would perform better but everyone knew that already.
LG C2
>120 Hz
>Gsync/Free Sync
>OLED
>VRR
>4x HDMI 2.1
>42 inches
>Low input lag for a TV (5ms)
And here I thought I had the latest TV in this thread.
how do you live like this
Cause it's all I have.
Nah, it's an old room that's getting redone, this will look a whole lot better.
>5ms
>he fell for the GtG response time meme
wew, lad. hope it doesn't have any interpolation or post-processing that adds input lag.
>He thinks we are in the early 2010's
It detects when you have a PC or console connected and turns everything off that would frick with your gameplay. Unless you found some article stating that which I missed when researching the TV before buying it.
Anon, you're moronic.
LG C2 is one of the very few "monitors" that actually has a real ~1ms GtG response time that is not just a marketing hoax.
5ms is its real input lag which is processing delay + response time.
All oleds have that 1ms response time
the neo g8 has <3 ms
ips are usually in the 10-15ms range
>ips are usually in the 10-15ms range
It's not that bad. Decent IPS usually have an average below 10ms. Often 5ms average, with only the worst transitions hitting 13ms.
No BFI
isn't the odyssey g7 considered the best monitor around still if you can deal with the curve shit?
the 240hz 1440p va g7 is good, one of the best atm
The neo g7 is better but most people lack the gpu power to run 4k 165hz.
There are some 240hz ips monitors that are flat.
do you guys know wtf these wavy lines on my display are?
I haven't been able to troubleshoot it or find any other case even remotely resembling this
ASUS VG248QE btw
I have had a ASUS VG248QE for close to 10 years that I use as a 2nd monitor and it has had weird vertical wiggly lines all over for a few years. I assume it's just something that happens when the monitor gets old and half-way dies. I can only really see it on white/grey backgrounds though.
yeah it's a lot less noticeable when I'm doing most things but drawing on a white canvas it's distracting as hell
it happened to both of my displays in different patterns, really bad luck I suppose
Currently on a 27" 144hz WQHD TN panel (lmao).
Not sure if I want to switch to an ultra-wide (34" 21:9) or a 32" 4k 144hz.
Ultrawide is not supported by most games, while I'm not sure if 4k over WQHD makes a big difference.
Price would be the same at around 700€.
>while I'm not sure if 4k over WQHD makes a big difference.
It will but it's nowhere near as big as going from 1080p to 1440p.
>Ultrawide is not supported by most games
You know just because you say something doesn't make it true right?
console cope
is there a point to 1080p in [current year]?
Cheap TVs for sure.
>point to 1080p
Actually playing games at a stable 60 FPS.
>27"
>3840
you dont need anything above 1080
use a tv
>4K
>144hz
Is it 2015 again?
The only monitors worth buying are TN monitors and definitely not for $700. Entire monitor market is dogshit. Not a single good one.
Are there any TN monitors for $700+?
for me it's the HP OMEN X
>1440p
>240hz
>fastest response times of any monitor to date outside of OLED
>fast response times even at 60hz for that one annoying game that runs poorly
>3 years of kino already
>soul TN panel that looks just as good as IPS
>HP
No thanks
ok
I will never give HP my money.
They can frick right off.
Because of them, I live in a world where printers are a scam.
Any recommendations for good 32" monitors that I can use for both art and games? I usually prefer single player RPGs anyway
My LED TV got a malfunction so I just recently ordered this one cause I found a good deal (open box) on it for 550USD including taxes and everything. Will primarily be using it to play my PS5.
Please reassure me that I made the right decision.
It will get the job done for console vidya gayming
just returned one from amazon that was a fricking lemon. Then again it was the a80k. Looked great when it wasn't fricking up, but then again it's literally the same panel as a c2/c1
>$700
>for meme specs
Hahah ah, great, go ahead, waste your money, consumer.
so guessing the Switch will look like shit on 1440p?
do curved monitors tend to frick with lines when drawing?
looking to get that fancy qd oled panel from samsung but idk if the curvature (1800R) will have a drastic effect on that
yeah you will want a flat panel for drawing
I recently got a Samsung G7 and the answer is, yes. It will 100% frick with lines. Even just browsing Ganker will make all text and the boxes around them seem weirdly slanted at times. I love it for gaming, but you definitely do NOT want to do any kind of drawing/editing/whatever that requires clean lines.
Yes. They all have unavoidable geometric distortion. Are you old enough to remember the CRT days when having a screen that was actually flat was not a given, but instead a highly desirable feature? Now people pay extra to have a monitor which distorts the image instead.
You could adjust the image to your liking on CRTs though.
You could adjust it to some degree to minimize geometric distortion but you definitely could not get it perfect on curved CRTs and even flat ones would have trouble getting absolutely perfect geometry across the entire screen like LCDs or OLEDs get by default due to the tech itself.
>LCDs or OLEDs
The clarity trade-off was huge though. I'm still holding out hope for a CRT successor like SED or FED.
What do you mean by clarity? LCDs and OLEDs are sharper than CRTs with the caveat that it has to be at native res, also due to the tech itself. CRTs always had some level of softness to them, they were never razor sharp like modern displays are.
motion clarity (i.e. no motion blur/"ghosting"/however the frick it's called).
Get the 32" 1440 one. I have 2 of them they're great.
I spent 700bux on a mag274qrf-qd and I regret it, not because of the money but because there's not that much difference compared to my 24" 1080p monitor. It's what, 30% bigger?
and your framerate tanks about 30% too lmao.
1080p 144hz adaptive sync is more than enough, don't fall for the meme
just use integer scaling and play vidya in 1920x1080 you dingus.
that monitor is 1440p bro
4k is too much of a meme
27 inch is for 1440p
get 32 inch
How do you feel about the 4070 being confirmed at 300w tdp for the base model (aib 350-400) and being priced around $699 (899 aib)? Oh, apparently equal to the 3080ti
>4K on a 27 inch screen
Anon why
Great image quality
get Gigabyte AORUS FO48U bigger is better
It's not going to increase the amount of dopamine your brain creates.
t. Playing on a 720p screen I found in the trash for free.
Also hilariously enough I have a good PC monitor and the color and contrast are better on that 2007 tv screen I found in the trash.
Whatever is happening to modern technology, it's not improving it.
Is the gigabyte G27Q enough or should I spend the extra 50 to 100 dollars for the M27Q?
I got the $400 49" ultrawide deal. Hope it's good
How is 32inch 1440p? Never seen one IRL
too big.
24" is the way to go.
God I want a 4k setup. Need a better job
>1080p
>144hz
Let me guess. You "need" more.
You don't have a computer that can take advantage of that resolution and refresh rate combination. Nobody does.
>cope of being poor
>not having an ultramega widescreen
absolute pleb
if youre not using a TV as a monitor, what are you even doing with your life?
monitors are a meme
TV's have fricked response rate, fricked hertz, fricked colors, fricked sharpness for text, and so on. Literally only 40 year old "gamer dads" would even think of using a TV for their PC.
I like it. Text sharpness is nice and helps for nip stuff.
I still do 1080p and don't intend on changing any time soon.
>700-900$ for a monitor
That's how much my entire rig cost before I recently updated the gpu, no thanks.
I got bad eyes and with the distance my screen is at I barely notice any difference between 1080 and 4k.
don't need strong hardware for 1080p so that's a real blessing in disguise.
also if you're used to 60hz, DO NOT TRY 120hz, you will not be able to go back, just stay ignorant and save processing power.
Unless you have a 3090Ti you will be pissed off you're not getting a consistent framerate.