Is it true that you need a 4K OLED TV to accurately emulate a CRT display or is it just a meme?
if you like how it looks then you like how it looks
accuracy is a spook
This. I've been playing with shaders for almost 10 years now. The most you can get is:
>1. A picture that matches your current screen to a CRT you have and know well. If you change your screen, you'll have to redo it all over again.
>2. A picture that pleases you.
Going with option 2 is quicker, easier, and more compatible across different hardware. I could even ramble about the CRT paradigm, but that's probably unnecessary.
this
I have saved two presets. One is "blurrier" for games with lots of dithering and the other is the same thing but sharper.
there is no "accurate CRT display"
every CRT is slightly different. It's why PVMs are meme'd
You need a 4K display that has HDR capabilities to simulate a shadow mask even remotely well.
>there is no "accurate CRT display"
every CRT is slightly different. It's why PVMs are meme'd
Why are PVMs meme'd, moron? Your statement makes no sense.
True, barring limited reliance on composite for blending dithering for faux translucency (which your eyes and brain are gonna blend to a degree anyway, regardless of display), the fact of the matter is that CRTs were thousands upon thousands of differing 'standards' which devs could never realistically target in any meaningful and specific manner.
The only time a dev could count on real consistency was with machines providing their own standardized screens, as in arcade cabinets and handheld games.
Go with what you think looks best, be that CRT (real or filter), PVM, or sharp pixels, as long as it's not some Super Eagle bullshit filter you can't be going very wrong.
You don't "need" it, but the more resolution, the better.
You need integer scaling for shaders and 2160p is exactly 9 times 240p.
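For anyone checking the math, a quick sketch (plain Python; 240p is used as the common CRT-era line count, the other resolutions are just illustrative panel heights):

```python
# Check which modern panel heights divide evenly into 240 source lines.
# Non-integer factors produce uneven scanlines with most CRT shaders.
for panel in (1080, 1440, 2160):
    factor = panel / 240
    kind = "integer" if factor.is_integer() else "non-integer"
    print(f"240p -> {panel}p: x{factor:g} ({kind})")
```

2160p lands on an exact 9x factor, while 1080p gives an awkward 4.5x.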
You need an OLED with rolling scan like the Sony PVM-1741 for accurate CRT emulation
they're the same image
You can't accurately emulate a CRT display, it's a different type of display. The best theoretical shader setup would still only be emulating a video of a CRT.
8k microled running at 500hz *MIGHT* be able to crudely approximate the experience of a low end CRT television
Wrong. 32K, quantum LED, running at 6400Hz, has around a 0.8% chance of replicating a broken shard of glass from the first prototype of a CRT.
Source: do you need one? lol go back, zoom-zoom.
You still can't emulate a CRT on a 4k OLED. It will always be a meme.
You can get close but fundamentally it won't work; the contrast and brightness would need a way, way higher range to accurately simulate what's actually happening on a CRT. Any modern panel will always lose brightness by using a filter, that's just physics.
Don't strive for CRT accuracy, strive for something that does a good job of optically anti-aliasing the image. It doesn't have to be accurate, it just has to look good.
I have very strong opinions on this subject based off the images like op's I see on my phone and memories of crt gaming from childhood
Eventually we'll be able to 3d print our own brand new PVMs...
did anyone actually test if that image is accurate?
it gets posted here a lot, but I doubt it is real
why wouldn't it be real? have you never seen a crt irl?
tbf it's more than 10 years since I saw one
but that image has more detail than the raw pixels the digital artist drew, which is very strange, anyone would doubt that
take the hand for example, where did that dark line separating the thumb come from?
you can blur, fake scanlines, do whatever, the end product will never come out like that
>which is very strange, anyone would doubt that
Composite
>where did that dark line separating the thumb come from?
240p did that. And then the brighter part of the hand (to the right) blooms vertically
There is no "dark line" separating the thumb, that's simply an anomaly from the scanlines interpreting the raw pixel data as a gradient, so it naturally causes the two pixel lines to trail, with the space inbetween being left empty.
man it's crazy how old video game artists thought of that
limitations spur creativity
it also helps that most people who worked on them were also technology enthusiasts
They didn't always, because you can't always count on those kinds of outcomes given the astronomical disparity in displays between end users.
The left in the OP is from Japanese twitter user @ruuupu1
He said he was using S-Video cables on a Trinitron
>scanlines interpreting the raw pixel data as a gradient
Do you realize how stupid that sounds?
Scanlines do not interpret anything
Its likely either due to S-Video being used or a phosphor color range difference
breaks between scanlines you mean
>Do you realize how stupid that sounds?
>Scanlines do not interpret anything
"Interpret" was probably a poor choice, being that there's not internal intelligence deciding to display it as a gradient, but in effect, that is literally what it's doing. The data being fed into the device is not a 1 to 1 input/output, so the CRT display "interprets" the data, IE, portraying it on screen, you spastic moron.
>Its likely either due to S-Video being used or a phosphor color range difference
Oh, you mean, it's likely due to the fact that it's a fricking CRT and is exactly what I just said, and the "phosphor color range" is just how the basic fricking technology of a CRT works by default, you posturing homosexual?
>The data being fed into the device is not a 1 to 1 input/output, so the CRT display "interprets" the data, IE, portraying it on screen
The CRT doesn't interpret anything. CRT is a dumb device.
What you are talking about probably has to do with the dot pitch or resolution (TV lines) of the tube
Learn how raster scan logic works
>it's likely due to the fact that it's a fricking CRT
I never said that you idiot.
Modern screens use different color space from early 90s and 80s CRTs.
However, modern-ish CRTs in the 2000s used the same color space as we do now for SDR
You have to take into consideration the different PAR
Image on the right in the OP is an emulator screenshot which likely didn't adjust for PAR, while the left shot is real hardware on a CRT which outputs proper 8:7 PAR
crt-guest-advanced with mask 5 and some magic glow n shit
I use the ntsc version with shadow mask and massive smearing settings to make it lookin real good but you probably wanted to see a close up of the phosphors using an aperture grille setting
They both look like shit when you blow up the image like that
that's also true
right looks like there's poop coming out her anus
cope
Well, that's where it comes from, right? Are you the only one who doesn't know? Wow.
>das ass vs. dat ass (in minecraft)
filters will never be CRT
You don't need either to actually play the game
> Play games without a screen
Look how stupidly long her leg is
that picture is just horrifically ugly and poorly-drawn in general, i have no idea why i've seen spergs repost it several times
Dat ass
You can already do it with an 8x8 overlay matrix on 4x multiplied 240p (960p) inside 1080p
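The arithmetic in that setup checks out; here's a minimal sanity check (the 60px borders and tile count are implied by the numbers, not stated in the post):

```python
# 240 source lines scaled 4x -> 960 lines, fitting in a 1080p
# panel with equal borders, and an 8x8 mask tile repeats cleanly.
src_lines, scale, panel, tile = 240, 4, 1080, 8
scaled = src_lines * scale                  # 960
border = (panel - scaled) // 2              # 60px top and bottom
assert scaled % tile == 0                   # 8x8 tile fits evenly
print(f"{scaled}p inside {panel}p, {border}px borders, "
      f"{scaled // tile} tile rows")
```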
>help me!! help me!!
squish
That looks so cool
Source?
You can make your own masks and preview them using this:
https://bool.space/mister/ShadowMaskEditor.htm
Make sure the sprite is properly cropped and have fun.
Expected drawback, but on a real display you would use HDR
See the difference is Hulk looks good, crt or not. The CRT just makes him look even better.
better but now the image is dark
>better
lol wut?
there is nothing about that that looks like a CRT or improves the image
To get the best possible result? Yes. The higher the resolution, the more pixels the CRT shader has to work with, and thus the more faithful the end result will be, especially with more complex masks like slot or shadow masks. Aperture grille masks can still look really good at 1080p, but will look better still at 4K. Moreover, you'll want HDR to offset the loss of brightness from the scanline gaps and the mask. The best shader for this purpose is Sony Megatron, which looks insanely authentic on a high-end OLED display. You'll still have shit motion due to sample-and-hold blur, though, and there's no way to get rid of that without Black Frame Insertion, which will halve the brightness and make the image too dark if coupled with a CRT shader, even with HDR. So we're still a ways off from a perfect solution.
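To put rough numbers on the brightness loss described above, here's an illustrative sketch; the coverage and transmission fractions are assumptions for the sake of the example, not measured values:

```python
# Rough average-brightness budget for a CRT shader, assuming a simple
# model: scanline gaps keep ~50% of panel lines dark and an aperture
# grille mask passes ~1/3 of subpixels at full value.
scanline_coverage = 0.5    # fraction of panel lines actually lit
mask_transmission = 1 / 3  # fraction of light surviving the RGB mask
bfi_factor = 0.5           # black frame insertion halves duty cycle

for use_bfi in (False, True):
    budget = scanline_coverage * mask_transmission
    if use_bfi:
        budget *= bfi_factor
    # On a 1000-nit HDR panel this is the effective average brightness,
    # versus roughly 100 nits for a typical consumer CRT.
    print(f"BFI={use_bfi}: {budget * 1000:.0f} nits effective")
```

Even generous HDR headroom gets eaten fast once BFI enters the picture, which matches the post's conclusion.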
that pic is only used as an example because it shows a girl's naked butt cheeks, and the butt cheeks on the left make you hornier because they look more real
Girl on right needs to wipe her ass.
Look at your thumbnail, numbnuts.
It is not possible to emulate a superior device on an inferior one.
You need to have pixel elements small enough to simulate the physical mask, and then likely at least 4x that to simulate the color bleeding.
Or you could just sit at a proper distance and your eyes will do all the blurshit you want for free.
Color bleed is a meme
At some point in the future we will get 10000PPI or 20000PPI displays which will obsolete the usage of CRT shaders completely
>an image that shows how CRT TVs differ from sharp pixels well can't possibly be used by more than a single person! I'm sure they're changing the file name each time they post so they can be posting it many times without people knowing it! Do you have any idea of how FRICKING moronic you sound right now? You stupid frick.
I saved that image and a bunch of other ones from that CRT twitter account last year; I used her to make the thread because she's the only one I had that had a comparison with the raw image
left lets you see she's actually Asian, heh
>colors are more saturated
Cheating
what?
Right is RGB, left is NTSC
that's what happens when you use the correct colour encoding
>left is NTSC
*S-Video not composite
Most of the difference in color is because of phosphor color space differences anyway
lol what?
S-Video is still NTSC or PAL anon, RGB is different
Left looks like its RGB on a CRT
They don't have blue hair and big eyes either, moron.
>Left looks like its RGB on a CRT
Its S-Video
The source for the image is @ruuupu1 from twitter
Japanese twitter user
Ah okay
>They don't have blue hair and big eyes either, moron.
What is your point exactly?
She doesn't look Asian whatsoever.
Her name is Chadarnook and she's a Goddess
The only Japanese character in FF6 I can think of is Shadow
Traditionally most Japanese RPGs stuck to the roots of RPGs which is D&D so that meant Medieval European fantasy.
>S-Video is still NTSC or PAL anon
NTSC and PAL shouldn't be used for Composite anyways, it's moronic.
RGB/Component can send a PAL and NTSC signal too in the sense that they send 480i60hz or 576i50hz
>RGB/Component
Component is still NTSC or PAL anon
RGB is not a connection
You are moronic
NTSC and PAL are specs not signals
In reality no video game system followed the NTSC spec verbatim (except maybe 3DO or some shit)
You cannot get PAL or NTSC RGB you idiot
stop talking about basic shit if you don't know what they are
PAL and NTSC are standards/specs not signals you moronic Black person
There is nothing special about them and games never adhered to them perfectly
Refresh rate was almost always off-spec compared to NTSC/PAL standard
Stop talking, you are full of shit
>STANDARDS
>USE MY NOMENCLATURE
They are literally frequencies used to encode colours, why do you constantly talk shit?
You are the TVL dumbass, aren't you
You were the one that stated an RGB signal can be PAL/NTSC so you can frick off
RGB can display both 480i60 and 576i50 and so can component
Nobody cares about some color encoding shit
Games never ever followed either spec perfectly, they always differed on resolution or refresh rate.
Stop lumping up RF, Composite and S-Video into one category. They all are different. S-Video is closer to RGB than to Composite.
Also Component is converted from RGB.
Again, you are full of shit.
You probably are a fat neckbeard.
>Nobody cares about what NTSC and PAL mean
I accept your concession
Can it, fattie
>menu
>color
>warm <<< >>> cold
Every time I see this topic, I always remember that people discount input delay as an important aspect of what's optimal, besides just the pixel dithering.
>nobody can feel it
>I have a personal threshold
>who cares?
It's still crazy to me that people are only after the superficial aspect and will never be happy, always making adjustments or experimenting with other methods of mimicking the CRT look.
That pixelated shitter!
>accurately emulate a CRT display
That's not a thing with current day flat panel technology. Maybe wait another 50 years.
Spatially we are almost there with shaders like Mega/Cybertron especially
Temporally we need 1000Hz and for the display to hit 30K nits for a few nanoseconds (can be done with MicroLED or QDEL)
We are pretty close, definitely not 50 years
Unless you want to simulate CRT beams themselves which is pretty pointless
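A back-of-envelope version of why peak brightness has to skyrocket for temporal emulation; the 1 ms lit time and 100-nit target here are assumptions for illustration, not the exact figures from the post:

```python
# If each pixel is lit for only a fraction of the frame (rolling scan,
# like a CRT phosphor), peak brightness must rise by the inverse of
# the duty cycle to keep the same average brightness.
target_avg_nits = 100    # ballpark average brightness of a CRT
frame_ms = 1000 / 60     # ~16.7 ms per 60 Hz frame
lit_ms = 1.0             # assume each line glows ~1 ms per frame
duty_cycle = lit_ms / frame_ms
print(f"duty cycle: {duty_cycle:.3f}")
print(f"required peak: {target_avg_nits / duty_cycle:.0f} nits")
```

Shrink the lit window toward microseconds and the required peak climbs into the tens of thousands of nits, which is where claims like "30K nits for a few nanoseconds" come from.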
>3d print a brand new CRT television
>it works
>3D Print Vacuum tube
yeah I wasn't sure if the optical fiber would work as filament but there were no problems
oled is cool but last time i was looking at getting a new pc monitor i couldn't find small 4k ones, only 1080p. my crt is 14 inch and is a really good size, i like it. by the time it dies and i'm looking at 8k i hope they do some small ones
How could it not be true? Are you moronic?