Is HDR just a meme?
Right is better thought
>Thought
american education system
Though is better although
Maybe he's not American, maybe he's a guy from South America who learned English all by himself, and by that he knows more languages than you, Mr intelligent. Did you think about that, prick?
They didn't make a standard when they released HDR. sRGB conversion and white point can make one game look good and fuck up another.
>its brighter so it must be better!
OP = gay
first post = best post
Left seems more atmospheric
It's a meme if you don't have a 1000-nit display
True
>everything is blue instead of dark grey
>WOW WHAT AN IMPROVEMENT
t. Zoomer that grew up with piss filters instead of Crayola 64
I've never seen an actual HDR monitor, as in 1000+ nits, proper color calibration, many dimming zones or oled, etc, so I couldn't tell you.
Watch youtube HDR videos on your phone.
I have a 4000-nit display with 1400 dimming zones and HDR is great.
That's what people tell me.
It seems to be quite the game changer at the high end, but anything cheap advertising itself as "HDR" is bound to look like shit, like the other anon pointed out.
>4000 nits
bro, you are going to burn your eyeballs
Yes, because it's not on Linux.
It kinda is if you are willing to jump onto some wip stuff
It (barely) is though, if you don't mind jumping through a lot of hoops.
What's the deal with this? Is it some hardware DRM shit or is it just not readily available?
The code isn't readily available as open source due to licensing, which conflicts with Linux's open source nature. Think of how HDMI 2.1 VRR is a bitch to get working on Linux while it's simple on DP.
>HDR10 is an open and royalty-free standard
I don't get it
It's open but it doesn't mean it's easy to implement. there's still licensing to go through.
"Hardware" DRM was the wrong term, but I figured it was some shit like this.
Basically getting a standard agreed on between a bunch of companies is hard, and until recently HDR monitors were a big meme. I think the main players trying to get HDR working currently are Valve and Red Hat.
Works on my machine. But the refresh rate is locked at 60Hz on gamescope-session and I can't figure out how to change it.
>X is Y but I can't figure out how to change it
sounds like everything in linux
Kinda depends more on how the game implemented the HDR. Tbh hardly any games use it properly except for a few like the Dead Space remake. Playing that game on a 4K OLED HDR TV is pretty kino.
It's a meme on non-oled displays because it'll wash out everything and make light bleeding 1000x worse.
I have genuinely no idea what display HDR is supposed to do.
It makes bright colors and lights much brighter. You can't really see it without an HDR compatible display but it's very noticeable with light sources, it makes them really pop out in a way a non-HDR display can't. Games still look fine without it though.
I have a cheapish TCL TV but the first game to make me notice it was Anthem, the absurdly bright explosions made me go "wow"
HDR displays (NOT to be confused with the HDR graphics setting) are meant to be able to display a higher range of colors and help keep dark or bright scenes from being completely blown out and losing detail in the process.
Basically dark scenes won't just look like Doom 3, and you can actually see objects in bright scenes instead of just being flashbanged.
bit daft that video.
What it doesn't explain is that he's compressing the dynamic range of the brightness to compensate for the limited brightness of his SDR screen, which is around 350-450 nits.
HDR is doing the opposite. It doesn't need the dynamic range compressed. You're getting the full dynamic range of blinding bright specular on black backgrounds.
>What I'm doing is to first compress the maximum brightness and then switch back to the normal video
what? You're still limited to the SDR maximum brightness. All that video is doing is blowing out the contrast.
Look at a real burning candle and then look at a glossy magazine photo of that same candle.
They may look alike, but one has a high dynamic range while the other has a low dynamic range.
It seems like you don't really know what's going on.
The video is very clear and you're making shit up about the content.
>You just don't understand...
I understand more than both the idiot in the video and you, apparently.
>I understand more than both the idiot in the video and you
>You're still limited to the SDR maximum brightness
SDR maximum brightness is pure white.
SDR minimum brightness is pure black.
How that looks is dependent on your TV/Monitor.
The difference between "HDR" and "SDR" is the brightness steps in between: 10-bit vs 8-bit.
The video isn't claiming to make a high contrast OLED out of a shitty LCD and it isn't demonizing OLEDs.
It's about the HDR marketing label, that many (you) get confused by.
>SDR maximum brightness is pure white.
>SDR minimum brightness is pure black.
A 200-nit screen showing a full 100% white level isn't going to have the white brightness of a 1000-nit screen showing the same 100% white level. Try it yourself on your monitor. Turn your brightness to 100% and then to 20%. Are they the same? No. So how is a 200-300 nit screen vs a 1000-nit screen going to be the same?
>The video isn't claiming to make a high contrast OLED out of a shitty LCD and it isn't demonizing OLEDs.
It's about the HDR marketing label, that many (you) get confused by.
shut up you fucking retard.
>The difference between "HDR" and "SDR" is the brightness steps in between: 10-bit vs 8-bit.
also how those steps are distributed, linear vs log
in SDR, if you brighten the display up everything gets brighter and your eyes will fucking hurt if you look at it; with log, when you brighten shit up the highlights get bright first, but the rest of the image still stays within a watchable range
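To put rough numbers on that log distribution: a quick Python sketch comparing the PQ curve HDR10 uses (SMPTE ST 2084, constants from the public spec) against plain sRGB gamma on an assumed 250-nit SDR panel. The nit figures are only illustrative, not any particular display.

```python
import math

# SMPTE ST 2084 (PQ) constants, as published in the spec HDR10 uses
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Map a normalized PQ code value (0..1) to absolute luminance in nits."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def srgb_to_nits(code: float, peak: float = 250.0) -> float:
    """Map a normalized sRGB code value to nits, assuming a 250-nit SDR display."""
    lin = code / 12.92 if code <= 0.04045 else ((code + 0.055) / 1.055) ** 2.4
    return peak * lin

for cv in (0.25, 0.5, 0.75, 1.0):
    print(f"code {cv:.2f}: PQ ~{pq_to_nits(cv):7.1f} nits, sRGB ~{srgb_to_nits(cv):6.1f} nits")
```

Half the PQ code range only gets you to roughly 90-100 nits, so most of the 10-bit steps are spent on the darker parts of the image and only the very top of the range covers the blinding highlights.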
yeah
>A 200-nit screen showing a full 100% white level isn't going to have the white brightness of a 1000-nit screen showing the same 100% white level. Try it yourself on your monitor. Turn your brightness to 100% and then to 20%. Are they the same? No. So how is a 200-300 nit screen vs a 1000-nit screen going to be the same?
dude
>How that looks is dependent on your TV/Monitor.
I think what you're confused about is perceptual brightness. You're almost right but you're missing a crucial detail.
"How that looks like is dependent on your TV/Monitor"
Yeah... but also no... The point of HDR is to stop that happening as much as is reasonably possible. The idea is to STANDARDISE brightness. (Except we created like 4 standards of HDR, but let's put that aside for now.)
You are right about SDR: min brightness is as dark as the TV can go, max brightness is as high as the TV can go and what that range is depends entirely on the TV. But only on SDR TVs. HDR TVs do not do this.
With HDR, brightness was given a standardised range of absolute values, so that SDR max was defined as, say, 250 nits. If your TV can go to 500 nits, SDR max would STAY at 250 nits unless you fucked with the settings to blow out the contrast. HDR gives the source material more headroom to go beyond 250 nits, to go brighter than SDR max as well as improve color gradients within the regular range.
The confusion happens because technology isn't there yet to display the full HDR range and so both source and display have to compress the range to what they can display and this leads to highly inconsistent results. There are professional monitors with fan assisted heatsinks that can display the full HDR range for extended periods of time but it's somewhat pointless to master your content for a display 99.9999% of the population can't have. Instead you do a "best effort" to match the HDR standard you're mastering for, but test it on various common TVs to make sure it doesn't look like shit.
It also doesn't help that people fuck with the TV's display settings until they blow out the HDR gamma curve so that it looks dimmer than their fucked up SDR curve, but consumers gonna consume.
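To make the "compress the range to what they can display" part concrete, here's a crude roll-off sketch in Python. This is a generic soft knee, not the actual tone-mapping curve any TV or standard uses, and the 4000-nit mastering peak / 700-nit panel numbers are assumptions for the example.

```python
def roll_off(nits: float, display_peak: float = 700.0, knee: float = 0.75) -> float:
    """Compress source luminance above a knee point so it fits under display_peak.
    Below the knee nothing is touched; above it, highlights get squeezed toward
    the display's peak instead of hard-clipping."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    mastering_peak = 4000.0  # assumed mastering level of the content
    t = (nits - knee_nits) / (mastering_peak - knee_nits)
    return knee_nits + (display_peak - knee_nits) * (1 - (1 - min(t, 1.0)) ** 2)

for n in (100, 500, 1000, 2000, 4000):
    print(f"{n:5d} nits in the source -> {roll_off(n):6.1f} nits on a 700-nit panel")
```

Below the knee the image is left alone, which is why a properly behaving HDR display shouldn't look any brighter than SDR for normal scene content; only the highlights above the knee get squeezed.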
Pure black is simple, but what does pure white mean? Is it just pure white colour, no admixtures of yellow or blue etc., or does the brightness of that white come into the pureness or what? Is it still pure white at low nits?
>Is it still pure white with low nits?
Depends on your eyes. I'm sitting in a dark room and set my monitor brightness to 0 and the textbox sure as fuck is white until I turn on the lights.
I think the DV spec at least is designed with 1000nits in mind, with lower peak nits resulting in reduced gamut
everything in that video is wrong
hdr is simple, light brightness can go up to infinity, to the point you are staring at a laser that will fucking burn your eyes out
in normal Standard Range the brightness is capped by the camera sensor, while HDR just allows a bigger range of brightness that lets you reproduce the effect of bright objects
obviously you don't want your tv to make you blind or set your room on fire with the brightness of the sun, but the current range of displays is very low, and only recently started to get better
>everything in that video is wrong
You don't contradict a single thing. Camera tech is outside of the scope of the video.
Misleading picture.
Both traditional SDR TVs like CRTs and HDR-Compatible Displays like OLED are capable of displaying pure black, let alone night skies and full moons.
HDR just turns the contrast up to 100 and that's it lol what a scam
with SDR white sunlight is the same colour as white paper.
with HDR white sunlight is bright and paper isn't
E.g. in Elden Ring on a 1000-nit display, the little glowing magic spells actually can dazzle you and look like those bright LED lights. In SDR they look like blue blobs on the screen.
>with HDR white sunlight is bright and paper isn't
but your eyes will adjust to the sun and you'll not see the paper white anyway
Found the retard
complete meme on anything but self-emissive TVs
every monitor (IPS, VA, TN) that advertises it is just doing it for marketing
even the new OLED monitors are just too dim to be worth using
>Get nice Samsung 4k TV with HDR to use as monitor
>dim as shit
>HDR works but is hardly noticeable when playing games
>had to upgrade windows to even use it in games
>awful fucking backlight bleed ruins the display within a year
>Get nice Samsung 4k TV with HDR to use as monitor
You bought a shit one. Not a nice one. idiot
>bought a shit one
ya no shit idiot thats what my post was about
>I bought a nice one
>but I meant it was shit. You just don't get humour, lolzers
Came here to call you a moron
>dim
I wonder what the problem might be, they have a fucking 1500-nit TV for 1k lmao, how could you fuck that up
People saying "nice" or "expensive" often fail to understand that not every TV or monitor has actual HDR features. Most of them have fake HDR where they brighten the whites and contrast-adjust the rest, making it look like washed-out shit.
>judging hdr by screenshots
IQ measurement required
Anon, I... You do realize that to actually judge the difference you need a HDR capable screen (and a good one at that) and HDR content, right?
He's talking about game hdr not screen hdr.
You need a hdr screen to see hdr in game tho
while true, the colour mapping of game HDR is often fucked up, which is what OP is making fun of
why do zoomers hate shadows and dark colors
Night is scary for a lot of children.
it's excellent on oleds
My favorite use of HDR has to be in this collection of movies. It has so many shades of black, white, and gray that it increases the detail to unreal levels.
Newly made movies like Puss in Boots look really cool in HDR as well.
As for games, there's definitely a difference, but it really depends on the game.
Windows 11 has an AutoHDR feature that seems to make my eyes hurt after awhile
cyberpunk 2077 on an HDR display is another good showcase of the technology.
Touch of Evil in 4K is brilliant, along with Psycho, Double Indemnity, Citizen Kane, Saboteur and Shadow of a Doubt. B&W looks amazing in 4K HDR.
I'm about to pull the trigger on an A95K just to watch kino in 4K HDR. Those old movies were shot on 35mm film, which has a real resolution of ~4K. So basically there was no better time for a home cinema, as you can actually experience movies in even better quality than when they were shot.
>not going for qn90b
Ngmi
It's okay I have a job.
>Those old movies were shot on 35mm film which has real resolution of ~4K
I thought the real "resolution" of 35mm was considered somewhere around 8k
In practice it depends on a bunch of factors, including the film stock, the lenses, processing and so on. Technically it probably has a lot more than that under perfect conditions, but in practice, and especially for older movies, once you get past 4K the biggest difference you're going to get is sharper grain, not so much actual image detail.
Well, that makes sense.
I bet those look incredible in HDR
>grey filter VS blue filter
I notice a big difference and I think HDR usually looks much better. only time I had to turn it off was for resident evil village. no matter my hdr setting that game always looks either too dark or too bright with hdr.
>Buy monitor for 320 bucks
>Is capable of 2K, HDR and 160 fps
>Switch to 160 fps
>Nice
>2K is always on
>Active HDR
>Somehow every color is fucked in Firefox and Windows
Somebody knows why?
Everything under 600-700 nits will be total shit with hdr, and you absolutely cant buy a monitor brighter than 300 nits for 320 bucks
I bought this one for $200 two months ago and it's 400 nits. A similar quality LG monitor is on sale regularly for the same price.
https://www.amazon.com/dp/B0949KL83T?psc=1&ref=ppx_yo2ov_dt_b_product_details
Fucked up how? Like washed out? That's fairly normal when activating windows auto HDR. Just shut it off and turn it on when you're going to play a game. You can press Windows key+alt+B to toggle it on and off.
>Like washed out?
Yes
>Just shut it off and turn it on when you're going to play a game. You can press Windows key+alt+B to toggle it on and off.
We live in 2023, why can't Windows adapt to HDR? Is this a technical problem or what am I missing here?
Are you really surprised at Microsoft being incompetent?
>Somebody knows why?
Yes. HDR and SDR have different colour profiles. You're only meant to turn HDR on while watching HDR content. You can adjust Windows to sort of match SDR brightness but the colours won't match up and it'll look odd and you'll see extreme banding etc.
Turn HDR off until you play a game in HDR, then turn it on before you launch the game.
playstation 5 has an automatic turn on when supported feature.
Some games also have that on Steam, but some don't. Returnal and the RE4 demo require you to turn it on beforehand.
Yes. But I'm replying to a post talking about Firefox browser running in Windows
Wild hearts starts in SDR even though that 'on when supported' mode is active on my ps5, and then when I switch it to HDR it's much worse. Had to disable it for that game, they fucked up.
Post your monitor model. It most likely doesn't have HDR. It's like claiming 4k resolution but you just stretch the image.
HDR doesnt actually work for sub-1k € monitors because they lack contrast & brightness and you're better off just using sdr on those.
Current operating systems don't handle HDR properly; you have to enable it manually before launching the game or watching a movie. Yeah, retarded.
You need to calibrate your monitor. Make sure you have the latest drivers installed for your specific monitor and find the Windows HDR calibration app and use it
HDR is indeed a meme and is just there to brick your TV faster. consume.
Not really.
In purely technical terms, HDR is a real thing. The implementation is questionable.
ligma test
The only meme is SDR and nothing changes that fact, not even the poor turdworlders here screeching about it.
you pretty much need an OLED or the most expensive VA panel you can get to see it. This tech came out prematurely. Most TVs nowadays support HDR but can't actually display it. That shit is only there to keep prices high.
In theory HDR is good, but in practice it's a meme. For one, there's no real standard for it. Like 95% of shit advertised as "HDR capable" is fake and gay and not actually HDR at all. Then there are the problems of implementation where it's not standardized so it totally comes down to the individual game/movie/whatever whether it's implemented properly or looks like shit. Then there are issues like how HDR on any form of LCD is a fucking joke because local dimming zones are required and fucking suck. Which leaves OLED as the only real option, but OLED sucks even more unless you're a dumb goyslop consoomer who thinks it's totally cool to pay a couple grand on a screen just for it to develop burn-in a couple years later and then need to go out and buy a new expensive OLED all over again.
What HDR monitor do you own?
>Burn in
You could have just said you're a complete retard at first, I wouldn't have bothered reading this shit.
Yeah, anyone who uses the terms " goyslop consoomer" is a retarded coping poorfag
you're right, technically it's organic compound degradation.
Sir, fyi full array qled was invented
i watched mulan in 4khdr the other day and that shit was kino
No, but your image is
HDR allows white light to look like it does on a CRT
no, its actually a great improvement, more noticeable than 4k
Yeah no
>vaseline games
I turned it off on my monitor because everything looked blown out and bright. Especially reds. I tried to adjust it but I don't really know what I'm doing when it comes to color and stuff. It looks fine in sdr to me. Could just be my monitor has bad hdr I don't know.
Nah, if it’s blown out bright it’s probably just not calibrated well
Bad HDR is dull mostly
Your monitor is probably ~400nits, which is about a third of what you need for HDR to look good. Honestly, it's not even worth calling it HDR.
Good HDR monitors cost a couple of thousand.
No
>Devs and movie maker have full control on how the image will look
>Strap a dozen filters on top to improve it
I'll never get it
Hdr is not a filter?
HDR is cancer
Good graphics are a meme
in my experience on most games it looks like shit
but on demos eg https://m.youtube.com/watch?v=n3Dru5y3ROc hdr looks good and sdr like smeared shit
no, but no gaming monitors can do good HDR
By no gaming monitors I’m going to assume you mean yours doesn’t
Can monitors really go higher than 700 nits
yup
Holy fuck
>no certification
it's over.
>43" monitor
>can't tilt
How do you even use that thing? Do you move your head the whole time?
no you dumb fuck, IPS can't do HDR, 99% of top-end gaming monitors are IPS or VA
you've got like 5 OLED monitors that will burn in, and will give you eye cancer from all the ghosting
The issue is that there is no standard or presets to adjust the HDR to the artistic vision. So you are basically lost in what should be the proper look.
the even bigger issue is that if the game doesn't render in HDR space then some gimmicky real-time conversion will be about as good as those shitty post-converted 3D movies we were fed 5 years ago
most games do linear space rendering both for accuracy and for better framerate
you know they just keep making shit up so you buy new product right?
HDR doesn't work like that. You also can't really showcase HDR on an SDR display anyway. Dolby Vision on OLED is amazing. I love watching old movie 4K conversions with proper HDR
I thought it looked good with RE7 on my gigabyte G27Q but every other game seems to look worse with it on, even other RE engine games.
If they want it to have the colors on the right they can make it have the colors on the right.
It’s an image trying (albeit unsuccessfully) to explain what hdr does so people with an sdr monitor can understand
But it does it completely wrong, think of HDR like being able to see light more realistically
Yes, way bigger impact than RTX.
The problem is that even mid range monitors really suck at it, so "poor" fags think HDR is a meme.
1000+ nit TVs/monitors make it shine, and if you then disable it you can't imagine that you used to be okay with non-HDR.
It really is night and day and it's not like OP's picture at all. That's oversaturation, that's not what HDR does
i must be really dumb, because i don't get hdr.
yes, colors are more saturated, but that's it. it doesn't look better. i have watched guides that say to actually make it look good you have to calibrate the hdr. i did, and it still looks normal + saturated.
but my ps4 pro was running hotter and sounding like a jet.
the only one where i can say it made an improvement was shadow of the tomb raider, because it made the dark parts less dark.
but isn't raytracing trying to do the same, but better?
No, not really. HDR is better colour and brightness range; RTX is better ray paths from lighting objects and reflections
>i must be really dumb,
>because i don't get hdr.
You might be, but it's also just as likely that your display is crap and/or completely off calibration in both SDR and HDR. Also not all developers have completely figured it out yet, and in some cases they've been using duct tape to attach it to legacy engines that weren't really designed for it from the ground up. The result is that it's still the wild west in how and how well HDR is presented across different games.
Here's my attempt at an HDR for dummies explanation:
HDR is not about everything being brighter, it's about being able to be brighter when it needs to.
HDR is not about everything being more saturated, it's about being able to be more saturated when it needs to.
In an environment where all the variables like display, game engine, room brightness and so on were properly used, controlled and calibrated, an HDR and SDR image of the same game should have a fairly similar base image, but any strong light sources would be brighter and show more detail in HDR, and any strong colors, like neon lights, would be more saturated and show more detail in HDR.
>Colors are more saturated
Then you are using a shit tier hdr panel that pumps shit up to compensate for how shit it is.
You have a shit monitor.
I owned one of those expensive 2.5k 4K monitors and its HDR was fantastic, but I scaled down to a midrange 3440-wide monitor. It was still $700, but the HDR was still fucking trash.
You need an OLED or a FALD display, then you know the brand took HDR seriously. Otherwise it's just "the monitor understands the signal" and nothing more.
No. But you have to have a decent display and proper calibration for it to look good. Most monitors blow at HDR except some of the newer OLED ones.
It's hard to really represent since you have to see it for yourself. You can take HDR screenshots but they are in a special format.
None of those images are in HDR, you just upped the contrast a bit
Every game since HL2 is HDR internally.
HDR screens just allow the games to represent the internal rendering better.
I'm thinking if someone made a gif sliding levels around in Photoshop on an HDR image and an SDR image, it would explain to everybody what HDR is.
The HDR image would stay just as detailed no matter the brightness, but the SDR one would get blown out or turn into a black blob. I'm surprised nobody does the comparison like this.
here's an example, it only needs better quality and the normal image under it.
You still need a HDR monitor to actually see it, so no other way to show it to nay-sayers but on an actual good HDR screen.
nah, you won't have to, since you'll be showing it in slices across different frames. In HDR every frame will look different and detailed like in that gif, but in SDR every frame will look the same, only darker or brighter, to the point the compression kills all detail
not a perfect representation, but it will show off the idea of logarithmic range
That can be achieved by changing brightness and contrast and other settings. If you can recreate it on a non-HDR TV it's not HDR. You are just showing ugly SDR settings next to better-looking settings
>That can be achieved by changing brightness and contrast and other settings
but it can't be
you don't have the range of information to reproduce the effect with just contrast
since 0 to infinity of log gets baked down to the 0 to 1 range of linear, you lose up to 99.999999999999...% of the information, you literally don't have it anymore in a normal image
here
i took an HDR image, exported it as PNG, imported it back into the same file and changed the exposure, see how much detail was lost, and it only gets worse the darker the image gets
i could convert it into a gif and brighten it up a bit, and since any display can do 0-nit colors even if it can't do 1000 nits, you'd be able to see the loss of data even on a shitty IPS that has 30% RGB color range and 200 nits
maybe i'll make such a gif every time hdr gets brought up, but i'm kinda lazy to export 20 images one by one and import them as an animation
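That anon's experiment, sketched in Python so anyone can reproduce the idea: a linear float ramp stands in for the HDR data, a clipped 8-bit copy stands in for the PNG export, and pulling exposure down shows how much distinct detail each one still has. The values are made up for illustration, not taken from his image.

```python
import numpy as np

# A linear-light ramp standing in for an HDR render target (values can exceed 1.0)
hdr = np.linspace(0.0, 16.0, 2048, dtype=np.float32)

# "Export as PNG": clip to 0..1 and quantize to 8 bits, like an SDR screenshot
sdr = np.round(np.clip(hdr, 0.0, 1.0) * 255) / 255

def push_exposure(img, stops):
    """Shift exposure by a number of photographic stops (x2 per stop), then clip for display."""
    return np.clip(img * (2.0 ** stops), 0.0, 1.0)

for stops in (0, -2, -4):
    print(f"{stops:+d} stops: distinct levels  HDR={len(np.unique(push_exposure(hdr, stops)))}"
          f"  8-bit SDR={len(np.unique(push_exposure(sdr, stops)))}")
```

The float ramp keeps revealing new highlight levels as you pull exposure down, while the 8-bit copy is stuck with the ~130 levels that survived the clip, which is exactly the detail loss he's describing.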
>any display can do 0nits
uhh
HDR in photography and HDR as TV term are different things.
Top looks like everything was photoshopped in lmao
yeah, but they both use a log color range to store the image data (assuming you are viewing real and not fake hdr)
hdr images are the best way to explain hdr to someone who doesn't have an hdr display, they let you show on an sdr display how the extra detail in hdr works
>HDR in photography and HDR as TV term are different things.
It's the same underlying idea though, of having access to and displaying a greater range of image data/detail than would normally be possible. The major difference is that with HDR in common photography you use the greater range you get from the different exposures in a very fake/photoshopped way to try to present it all on a regular, limited monitor. Whereas with HDR displays you don't have to fake it, because the display is actually capable of showing that greater range natively.
Can someone explain where all the orange comes from? None of the source pictures have that.
The orange is pure fantasy/photoshopping and has nothing to do with HDR.
then why do I see more orange when I turn HDR on
In what context? Some random game? Could be different things:
- In your case the orange may be part of the original data, but it is being washed out slightly by the conversion to SDR, so in HDR mode you get to see it as it should be.
- The SDR and/or HDR modes on your display are not calibrated or close enough to reference.
- The game handles HDR improperly.
In that specific photo however it is 100% faked, someone just thought it looked nicer that way.
Enabling it via windows adds a very noticeable orange tint, recent games I've tried with it like DMCV and mass effect LE have it but not as bad. I assume it's just my display, I've messed with the settings before but didn't really follow any guide, just changed everything until I liked it
In that case it's possible you have different white balance settings in SDR and HDR modes. Look around sites like rtings, they often post calibration settings on their display reviews. You should never copy any of the detailed settings where they mess with specific RGB values, but it might be helpful to get the display to a neutral baseline if you're not that familiar with what everything does.
Do I need the ICC profile too? I just tried to install it and it's not showing up in color management when I add it.
If you don't know what you're doing I would avoid complicating things further with ICC profiles, I'm not super familiar with monitors as opposed to TVs, but I would try to stick to the basic settings that can be changed on the monitor itself.
HDR is good but 90% of monitors are fake-hdr so people think it's some placebo
I got a Samsung QN94A and it looks amazing for HDR. In games though it has a local dimming issue that dulls the colour to keep blooming under control with bright objects in dark areas and it’s real fucking annoying. I’d like an OLED but i hear they have their own issues to some degree. It’s a case of picking your poison with TV’s and gaming ATM. Even this years TV’s ain’t perfect for games.
Is HDR 400 worth using?
It's completely up to you, some people don't like it, some people think it's still a visual upgrade. If you already have a 400 monitor then just try it and see for yourself.
I have an LG C1 and it's a huge game changer. I pretty much play everything I can in HDR.
HDR is great and will be the standard in a few years but there still isn't a monitor or TV out there that can display no compromise 1000 nit HDR. FALD screens can go bright but have blooming and dimming zone flickers, OLEDs have the pixel perfect contrast but can't go over 500 nits on a decently sized area.
>Those low Hz
My condolences, for me that is one of the most important things, high frames
>60 Hz
Holy poorfag moment
>muh framez
Yeah I bet you're a pro CS gamer too
I haven't played in 60 Hz for a decade now, stop coping
I have a C1 lmao
So get yourself a HDMI 2.1 cable.
You do know with that GPU you could get most games running at 75 Hz or higher
I'll stick to 30fps if I can max the graphics instead
I had that cope on my old laptop too, minus the max graphics
LG C2 literally does 1000 nits.
iirc it hits 700 max in tests, G2 gets closer to 1000 despite using exactly same panel because it's overclocked to shit and has heatsinks everywhere
youre right, i was misremembering
yeah for 5 seconds in a 5% window before ABL kicks in and gimps that in half.
To be honest, it might take a few years for OLED to make full use of HDR. While flagship models are already good for HDR movies and gaming (peak brightness 1,000 nits+ this year hitting 1,300 nits at 65D), all OLED products still have flaws:
>WOLED
color luminance is still too low, even with MLA. BT.2020 range is still below 80%.
>QD-OLED
near blacks are bad: EOTF tracking is either crushed or lifted, near-black color looks greenish rather than black, and dark gradients show banding
The next step will be blue phosphor, which might drastically increase brightness. Until RGB OLED becomes viable for screen sizes bigger than 48 inches, which will probably take longer than the mass production of microLED or at least real QNED, OLED will probably hit a wall sooner rather than later.
I'm a neet btw.
How many monitors have true HDR right now? Like maybe 5, right?
HDR on a 4k Oled TV is a fucking amazing gaming experience
For 90% of games yes. For Uncharted 4 and Forza Horizon, no.
I don't know what HDR and SDR even mean.
Yes, since I can see the difference on a non-hdr screen.
I got a Sony X900E and games look much better with HDR off. Probably because my TV is old? At the time it had good ratings.
I can't see any difference between 1080p and 4K, are HDR or OLED similar meme features that do nothing?
>I can't see any difference between 1080p and 4K
I genuinely think you need glasses
I thought so too but I've gotten my eyes checked multiple times and I always have near-20/20 vision. I got glasses a time ago because my eyes hurt when reading sometimes but never use them. I've seen 4K tvs and I just don't know what is added. To me 1080p doesn't look blurry or anything, it's not like there are flaws that need to be fixed, y'know? So I'm not sure what improvement I'm supposed to be seeing with 4K.
Image should be sharper, and more details, while 1080p has bit softer and not as sharp image, lets say if you look at someones face on 4k you can really see the smallest detail
But if I look at someone's face in 1080p it looks the same as in real life. Unless like the image has been compressed or something. There are no details missing that I can notice.
And if you squint a bit their faces look no different than in 480p, so why even go higher right? Lol
Because I don't squint in real life?
Start squinting and 240p will look like real life
Yeah and? I don't understand what you're trying to say. Why bother going to a higher resolution if the eye and brain can't receive/interpret more visual information?
Your eye and brain maybe cant, i can see the difference and its decently huge
Optometrist, now, anon
The difference between 1080p and 4K is entirely dependent on your screen size/seating distance ratio. If you sit too far or your screen is too small, you're simple not going to be able to appreciate the increased resolution no matter how good your eyesight. HDR and OLED however do not really depend on that in the same way, perfect black level and greater contrast can be enjoyed regardless.
Granted you may still be one of those boomer/zoomer types who can't tell the difference and are perfectly fine streaming movies in awful quality in 12 parts on Youtube, but if you have any standards OLED and HDR is a huge improvement.
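The screen-size/seating-distance point can be put in numbers. A small sketch, assuming the common rule of thumb that roughly 60 pixels per degree is about the limit of 20/20 acuity; the 55" screen and the distances are just example values.

```python
import math

def pixels_per_degree(h_pixels: int, diagonal_in: float, distance_in: float,
                      aspect: float = 16 / 9) -> float:
    """Horizontal pixels per degree of visual angle for a flat 16:9 screen."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

for res, name in ((1920, "1080p"), (3840, "4K")):
    for dist in (36, 72, 120):  # viewing distance in inches
        print(f'{name} on a 55" at {dist}": ~{pixels_per_degree(res, 55, dist):.0f} px/deg')
```

At couch distance both resolutions can end up past that ~60 px/deg mark, which is why 4K looks like a wash to some people, while the perfect blacks and extra contrast of OLED/HDR are visible from anywhere in the room.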
It's just really hard to compare SDR to HDR without looking like a retard or lying, since the people you need to convince still have SDR and thus you can't truly show them the difference. It's the same as how when you find "60hz vs 144hz monitor!" comparison videos it's usually like, 24 or lower fps compared to the same animation in 60fps, since if you compared 60 to 144 on a 60hz monitor they'd both effectively be 60fps.
Dude, if you can't notice the difference between SDR and HDR, you either need to see a doctor or your monitor/TV is garbage. The difference is massive on my C2 55" OLED
Please post your IQ and native language so I can correctly address your reading comprehension issues.
It's really funny how poor most of this board is, as big of a loser as I am at least I have a woman and am not as poor as you lot
>at least I have a woman
My condolences. I'd rather be poor.
HDR is only worth on OLED/QD-OLED screens.
>Bought HDR compatible monitor last year
>turn HDR on
>everything is dark as fuck
>screen blinks when I alt-tab out of a game
>turn off HDR
>everything looks fine
>HDR monitor
It happens because your screen is garbage for HDR. 99% of "HDR compatible" monitors just try to simulate HDR and they dont have enough nits to do proper HDR. The only screens that can do proper HDR are OLED/QD-OLED ones.
How to get the proper HDR experience, from an ex-Brandsmart TV store employee expert myself:
>Buy a TV that has at least 1000 nits of peak brightness (LG C1 OLED for example or high quality Mini LED from Samsung)
>Calibrate the TV settings according to RTINGS.COM
>Run the Windows HDR Calibration tool from the Windows Store if you're using a Windows PC, it's a requirement
>Play in a room that is well lit, not totally bright but not totally dark
Congratulations you have the peak HDR experience, everything looks like real life now.
Isn't c1 800nits?
OLED can get away with lower nits because they have more dimming zones than other displays and every pixel can individually control itself. It's a perk of the tech but also a drawback as that's what allows burn in and wearing out colors.
C1 55'' has 800 nits and c2 55'' has 850 nits.
this thread reminded me to calibrate my new monitor for HDR
thanks anons
Hi guys let's make a definitive list of good and bad HDR games? I've only played a few. I'll start
>Good
Uncharted 4
Dead Space
Modern Warfare 2019
Doom Eternal
Forza Horizon 4
Forza Horizon 5
Ori Will of Wisps
Halo Infinite
Mass effect andromeda
>Bad
Cyberpunk 2077
Red Dead Redemption 2
Spiderman
God of War
Assassin's Creed Origins / any Ubisoft game
Devil May Cry V
Resident Evil 7
Resident evil 2 remake
Hitman
Uncharted Lost Legacy
Wo Long
Ghost of tsushima and GOW ragnarok HDR are pretty good.
RDR2 is not that bad if you turn on the "game" visual profile, which unlocks more range. The RE games, at least after the PS5 upgrades, are way better than regular SDR. DMC5 still kinda fucked though. The Decima games have pretty good HDR as well.
>Good
Returnal
latest Gears of War, especially the expansion
And Im pretty sure Cyberpunk and RDR2 got fixed long ago. Atleast for Cyberpunk I played through it with HDR and it was good 6 months ago.
RDR2 looks fantastic on PC
What exactly makes a game HDR good or bad? Most of the time it boils down to user preference.
technical implementation
You have to elaborate further on that. It's just an on/off toggle for most games, while others have settings the user adjusts. Games with settings are not always good either.
On my Samsung QN94A it can go to 1500 nits in a game. So in SDR looking at the sun or a fire is bright, but in HDR it's super bright and still has dark details that aren't blown out. It's like having a max brightness flame on a wall right next to a dark doorway.
Cyberpunk HDR is legitimately bad on OLED because they got the black values wrong. You need a mod to fix it.
>What exactly makes a game HDR good or bad?
Good HDR is when things like light sources, fires, explosions, effects and specular reflections are much brighter, more saturated, and show more detail than in SDR, without making the rest of the image brighter as well, i.e. you want as much contrast as possible between light and dark. Preferably without having to mess around with a bunch of sliders to achieve it.
In some games the brightness increase on these parts of the image is very small, so you don't get much of an HDR effect. In some cases you can make it go brighter, but then it makes the rest of the image too bright as well, so in the end you don't gain anything. You would get largely the same effect playing in SDR and just increasing the display/monitor brightness. Sometimes you get good contrast, but things like lights reflecting on wet pavement don't actually show that additional detail you want, it's just a brighter blob of largely white. It all depends on the game engine, how many corners they've cut, how SDR-specific the pipeline is with their shaders and so on.
Why is Uncharted 4 there? It doesn't have any HDR settings and just outputs up to 10,000 nits
The expansion fixes it though and looks much better.
Uncharted 4 (PS5/PC)
Resident Evil 7 has the best HDR configuration. It has like five different ways to calibrate the gamma and brightness. I wish more games had this.
Put Ori in Good btw, that lil bro blinds my retinas in a good way
it can look good in specific situations, like physx when that first came out. As long as devs don't go full waxed mall tile mode, raytracing is nice.
Sea of Thieves has amazing HDR. At night on the ship with all the lights off in the cabin is pitch black then you turn on a candle illuminating the room perfectly. Also the fireworks are really cool and vivid.
>tfw only 350 nits
Its over.
>that backlight
Is it me or is Hogwarts Legacy's HDR pretty bad even on a C2 OLED?
Im on a c2 oled too. Hdr's shit on that game.
Yes.
HDR is a meme yes.
Dolby Vision is not. Makes the image look much better. Sony needs to support it like MS sooner rather than later.
I think it's a corporate scheme to make streaming and video captures look like shit.
HDR10 in a display allows vidya to use a wider tonal range. Of course it's not a meme, it's obvious how it works and why it's beneficial. You want your bright day sky to be physically brighter than the ground it illuminates, you want sources of light to be significantly brighter than a dark city without losing any detail either in darkness or in bright areas. However, vidya implementations are all over the place, some games look wonderful, some are so broken you're honestly better off disabling it (AC Origins).
Got meme C2 recently. Damn, i sure missed those real blacks, it looks like entirely different image quality, and once you pair it with bright HDR lights in vidya, the result is just insane despite the fact that C2 isn't even all that bright for 2023 flagship standards.
You don't need 1000 nits when you have infinite contrast, that's why it still looks great despite those OLED barely hitting 700nits on a small window on a good day.
It's a matter of contrast. Those people claiming that 600nits is too low for HDR don't understand the whole story.
Those TVs with a backlight need to hit >1000+nits for a similar HDR quality to OLED because they need the contrast to be higher between black and full white light. Despite the higher peak brightness, they still cannot beat the perfect contrast on OLED.
They have FALD systems to create blacker blacks, but that also creates input lag for games, which is another point in favor of OLEDs if you want HDR gaming. HDR game mode on those TVs/monitors typically also has lower zone processing, so that's more blooming and less contrast. And FALD is fucking expensive too. It's just a bad idea all around.
what's amazing is you can view the hdr screenshot on your normal screen. incredible right
As long as there is no pure black to display, it's possible.
The average fag's experience has been ruined by fake/crappy HDR because TVs and monitors have been allowed to throw the label on whatever.
It just ends up making things look grey and washed out instead.
Those who get the real experience know that it looks nice.
I have an hdr display but how do I use it? There's a windows setting, but does that fuck up content thats not hdr? Do I go to my monitors settings?
If you're using Windows 11 download the Microsoft HDR Calibration tool from Windows Store and run it. Then just turn on HDR in your display settings. SDR and HDR should look great and Auto HDR should enhance old games.
Turning HDR on forces my monitor's brightness to be 100% which is fucking blinding. I keep it at 30% at all times.
This. God forbid you play a game that does full screen flashes like counter strike.
That's how HDR works, the display is set to 100% brightness, so that brightness can actually be available for the brighter pixels that need it. But what should happen is that everything else that doesn't need to be super-bright, like the desktop, should have much lower values in the video signal to compensate. Optimally, on a properly-working and calibrated setup, switching between HDR and SDR should have no major effect on things like the desktop or interfaces. I don't know how HDR is in Windows now, but it was a fair mess a few years ago so I mostly avoided it. It's so much simpler on newer consoles/TVs where everything is designed around it.
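For anyone curious what "much lower values in the video signal" means in practice, here's the inverse of the PQ curve from the public ST 2084 spec (the same constants HDR10 uses). The nit levels below are just examples; 203 nits is the value commonly used as reference "graphics white" for HDR interfaces.

```python
# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq_code(nits: float) -> float:
    """Inverse PQ: absolute luminance in nits -> normalized code value 0..1."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (100, 203, 1000, 4000, 10000):
    code = nits_to_pq_code(nits)
    print(f"{nits:5d} nits -> signal {code:.3f} ({round(code * 1023)} / 1023 in 10-bit)")
```

So a well-behaved HDR setup sends the desktop at roughly half signal and saves the top quarter of the code range for the 1000+ nit stuff, which is why switching modes shouldn't blind you when everything is calibrated.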
>"HDR is a meme"
>"4k is a meme"
>"Raytracing is a meme"
when will poorfags stop complaining? It is 2023 ffs
raytracing while technically amazing is a meme
Raytracing is the same as HDR, it depends on implementation. Cyberpunk is amazing with raytracing. Dogshit like Control is terrible with raytracing.
Yes I started gaming on Xbox instead of PC because of HDR on my oled TV being that much better
HDR is always shit when I enable it in games. Do I gotta enable it on both my monitor settings and in game to balance it correctly or something or is it just shit on PC?
its just shit on PC. HDR is unironically better on console
It works fine on Windows 11 you just need to
>windows 11
no thanks
Then no HDR for you on PC
There's no reason for that configuration tool to be locked behind 11
windows 10 can't do HDR properly. it's not the tool's fault, WDDM is older on Windows 10. upgrade your deprecated OS.
guess i'll have to use my windows version Xbox UI for perfect HDR and keep my PC clean of all that shit that is windows 11
It's largely due to hdr being shit on monitors due to lack of standard enforcement and reliance on OLED, which I think has production issues at monitor size among other issues
Well I plan to grab a new TV and a Series X this year for Gamespass basement play so maybe Mass Effect LE will have tolerable HDR that way.
do it, a nice TV and series X makes games look amazing in hdr
TV I'm replacing has been there 13 years so saving for the LG C2 but that 65 incher is expensive and I got a new couch on my CC as well.
It IS good, but only at the highest end, so it's not yet worth it.
Spending money on HDR400 shit today is like being the first fucker who spent $5000 on the first shitty plasma TV, or the first 4K TV.
The only constant in tech seems to be that no matter what, early adopters always end up getting fucked.
both look horrendous to be honest
I thought so until I experienced it on my LG C1 OLED TM
Yeah if you're a poorfag coping with a cheap display
HDR and Atmos are only a meme to pc retards because they never have a good implementation. Meanwhile, my large 4k tv and 5.1.4 setup shits on all of your /r/pcmasterrace setups.
Ofc a console-only moron is too stupid to realize you can just use a nice modern OLED TV as a monitor and enjoy nice HDR.
based. PC cucks in shambles with their poor HDR implementation
Can't use freesync and HDR at the same time and I'd rather use freesync
I can. But my Denon receiver doesn't work with VRR even though it says it does. And I would rather have surround sound than VRR.
Dunno. I have never seen true HDR
I always thought HDR was incredible, even in its rough early stages with games like Forza Horizon 3; the way the night sky and taillights popped was so cool to me. I really don't see the issue people have with it. It's a far more readily noticeable upgrade than 1080p to 4K at first glance.
HDR is great. The problem is there are a bunch of shitty IPS and VA monitors that advertise HDR support without actually being able to produce the brightness levels and contrast that are needed for HDR.
That picture is retarded, you can't see HDR if you don't use an HDR display.
>what is HDR photography
Not the same.
HDR photography captures details that would otherwise be crushed in the blacks or whites because the sensor wasn't sensitive enough.
But you need an HDR display to show HDR content, that is, content encoded in an HDR format with higher bit depth than normal displays; otherwise you would just crush the details in the picture/video on an SDR display.
On an OLED no it's fantastic
You know those Capcom RE engine games with their washed out colors and low black level? On HDR that's completely fixed.
Are CRTs HDR?
no, as they're not processing 10bit image data and shit. CRTs are just SDR with near perfectly black blacks
They display whatever you feed them through the analog cable, so with pic related something better than SDR could work with professional monitors.
Even if the signal is analog it still needs to be sent in a certain way, you can't just crank it up beyond specs and expect the display to act accordingly. I don't think there's anything that would technically prevent one from creating or perhaps modding a CRT to be capable of some kind of HDR, but I imagine the biggest problem would be to do an electron beam that could light up the screen with enough power, while simultaneously not burning your house down.
What the fuck is that and why?
Theres no way whatever the fuck that is provides any benefits over a normal cable.
Only for true black's IIRC.
No, they only understand the traditional SDR type of signal. Though I'm sure in some obscure corner of Japan, someone at some point probably developed a prototype CRT with some kind of rudimentary HDR.
Here's the dirty little secret with HDR on windows:
A lot of games run in a limited-RGB container, but most GPUs and PCs default to full-RGB. The result is a washed out video.
Any of you fags that play on a LG OLED, try changing your TV RGB to limited while keeping the GPU on full. You will see an IMMEDIATE pop in colors and erasure of the washed out image. Especially noticeable in Sony games like Uncharted 4 which I've been playing lately.
But it isn't a perfect solution either because this workaround limits HDR brightness and I think crushes blacks slightly. But the image quality is so much better.
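The limited-vs-full mismatch is easy to see with raw numbers. A minimal sketch, assuming 8-bit levels where limited/video range puts black at 16 and white at 235:

```python
def limited_to_full(v: float) -> float:
    """Expand a limited/video-range value (black=16, white=235) back out to full range 0-255."""
    return max(0.0, min(255.0, (v - 16) * 255 / (235 - 16)))

# Content encoded in limited range: black is stored as 16, white as 235
limited_black, limited_white = 16.0, 235.0

# Display knows the signal is limited -> it stretches the range back out correctly
print("handled as limited:", limited_to_full(limited_black), "to", limited_to_full(limited_white))

# Display assumes the signal is full range -> 16 and 235 are shown literally,
# so black is a raised grey and white never reaches the panel's peak
print("treated as full:   ", limited_black, "to", limited_white)
```

When both ends agree, 16 gets stretched back to 0 and 235 to 255; when they don't, black floats up to a grey and white never hits peak, which is exactly the washed-out look that anon is working around by forcing the TV to limited.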
Yes, it's a meme for girls.
The government creates a problem, "omg! it looks so bland and colorless!" on purpose. Then women go get excited for when they bring in the solution.
If you can slice off the head of the satanic snake that broke the colors in the first place, then you can stop the whoring out also to gimick solutions like Camera shake, bloom, depth of field, controller only games on PC, and more. (They worship a whore woman).
incel moment
Go drown in the water as a pig, demon. Baptism saves demons too.
The only time you ever want HDR is on an OLED or a MiniLED or something like that where pixel brightness matters. HDR otherwise makes so little difference, it's hard to notice that you even have it on.
It's a fucking disaster. Using HDR on W11 is still a massive glitchy inconvenience. Also there are no proper HDR displays out today and for the foreseeable future (at least 6 years).
Yes
Games do look better but its just so fucking bright. I don't understand how people can use those 1000+ nit brightness panels. I find myself switching back to SDR with brightness set to 180 nits so my eyes don't get strained in the dark.
Just because the panel is capable of 1000 nits doesn't mean you should increase the brightness of the image across the board, in fact that's detrimental to the HDR experience. What you want is a normally bright image, with what's called "diffuse" or "paper" white around 100-150 nits, and then reserve those extreme 1000+ nit values for displaying strong, typically fairly small light sources. That way you get the greatest contrast which is what makes it realistic.
I tried to turn on HDR on my monitor once, got literal pain in my eyes
since I'm viewing this in SDR and the right looks better, then yes, HDR is a meme, since that can clearly be done without it...
HDR is one of those things that people think does nothing because their monitor doesn't support it, like FPS above 60.
The industry set HDR back literal decades by certifying trash displays that can't do it properly.
I don't blame anyone who thinks it's a meme, most people buy displays that don't get bright enough, use HDR and assume it's a meme.
They should have set HDR minimum spec to like 800 nits instead of 400.
How are miniLED monitors?
It's only a meme for those who don't have a good monitor for HDR (like mine). Still, you can notice the difference in games that support it, for example Elden Ring: after playing a lot with HDR on, turning it off you immediately notice that the image is now less rich and interesting.
But most people don't even have monitors capable of that, and even the ones that do are low quality shit like my gigabyte m27q.
I think the best way to describe HDR to someone who has no idea is to use an audio analogy.
SDR displays are like listening to a song with the dynamic range squashed flat: the loud parts are barely louder than the quiet parts.
An HDR display is like listening to a song normally, with some parts barely audible and then suddenly the guitar riffs in and blows your ears off.
Yes. Just get a plasma - they have extremely high contrast by default so they don't need it.
I have a HDR capable TV and HDR either looks like shit or it doesn't look different from SDR at all.
I even have a PSVR2 which has HDR enabled by default and it doesn't look much different to SDR honestly. Whenever someone goes "whoa the lights are BLINDING!!!" I have to wonder if they're retarded or I'm unaffected by brightness.
Garbage TV, garbage content, garbage eyes or a bit of all three. Proper HDR makes a big difference, with highlights looking really muted and compressed in SDR. Switching to HDR is like lifting a veil, the sense of clarity increases and everything just opens up.
unless your display's panel is capable of showing HDR, that pic in the OP won't really show what HDR looks like. And even if your display does have HDR, most people are too retarded to calibrate their colors for fidelity. For me HDR makes a huge difference, it accents things like fire so that they pop more, and it makes darks look richer. Not to mention the depth of color gradient you get with HDR, it's just beautiful.
Here's a screenshot I just took, with in-game HDR turned on. When viewed in SDR outside of game, it looks completely washed out and gray, way worse than SDR. But in game, this looks lush as hell
indeed, the pic in OP shows what GOOD SDR looks like, since it's not an HDR image etc
devs are just lazy and make their games bland as shit then do an auto hdr pass
>tfw 24" 1080p/60Hz VA monitor
Strange how the "HDR" on that supposed example still looks miraculously better on my SDR screen. Almost as though it was made with a specific conclusion already in mind as an advertisement point.
How "HDR is a meme" retards still exist in 2023? HDR has been the standard on TVs made after 2015.
You'd be surprised by the number of people who don't own a TV these days, who watch everything on their monitor, laptop or tablet.
Yes it is, you using this image is retarded no matter what your stance is tho
It's a pointless thing that's only really useful for flashbanging your viewer with even more contrast than necessary.
I have the MAG274, the HDR works pretty well. Auto HDR kinda blows out the whites in the Steam overlay and makes it super contrasty, but that's it; otherwise even with its relatively low nits it looks pretty good, text is enhanced a lot and any lighting looks much better.
Whoa. My SDR monitor can display that HDR image.