If you're using good cables then you may as well use a flatscreen over a CRT, and if you're using a flatscreen then you may as well emulate
Yes, you should consider using the highest fidelity video option available to you. The biggest retards in retro gaming are the people who think using component on a CRT removes the soul or something. It's just better.
RF gang, it must look as hard to see as possible. Text must be blurry.
Premium cables only made a difference for very high resolutions. Every VGA cable can handle 640x480.
The main reason to use a CRT is the unique combination of perfect latency + superb motion quality.
Huh?
The idea with better cables is to cut noise in the image
Which is more visible on bright solid colours
The idea with better cables is to increase sharpness and prevent ringing. If there's noise it's probably from the source, not something picked up by the cable.
Unshielded cables can pick up interference from other sources and cause noise; try cheap PS2 component cables against shielded ones and it's really easy to notice.
You can’t reason with CRTards.
A modern flatscreen has less than 1ms of latency. But let's humour you and assume 3ms of latency: you might have had a point about latency if you were playing modern games on a 240Hz monitor, but a frame persistence of 16.7ms or more is significantly higher than 3ms, so any impact on reaction time is extremely questionable.
>superb motion quality
Lol. CRTs have afterglow up the arse, which I actually like, because the main reason to use a CRT is the unique combination of its quirks and its ability to do composite justice.
>frame persistence of 16.7ms or more
This only happens with traditional sample-and-hold LCDs. Modern gaming LCDs support strobing. But because they strobe the whole screen at once, they have to wait until all the data is transferred to the screen before starting the strobe. CRTs strobe each individual line and avoid this latency penalty.
This isn't such a big deal at high frame rates, and emulators support software black frame insertion to get equivalent latency to doubled frame rate, but with original hardware it's noticeable.
>This only happens with traditional sample-and-hold LCDs
By frame persistence I'm not talking about your ghosting or whatever but the framerate of the game; 3ms of latency is nowhere near enough to be perceptible on a 60fps game let alone 30.
Persistence causes sample-and-hold blur. Modern gaming LCDs strobe to avoid this, but they have to wait for the whole frame to be transferred first (frame time minus the blanking interval, so about 15ms for most 60Hz video). This has nothing to do with ghosting, which is an additional form of blur specific to LCDs, and which is very small on modern gaming LCDs.
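If you want to sanity-check that ~15ms figure, it's just the 60Hz frame time scaled by the fraction of the frame that's active picture. Rough sketch, assuming NTSC-style 480 active lines out of ~525 total:

```python
# Rough check of how long the active image takes to transfer before a
# whole-screen strobe can fire. Assumed NTSC-style 480-line timing,
# not any specific panel's spec.
frame_time_ms = 1000 / 60        # ~16.7 ms per frame at 60 Hz
active_fraction = 480 / 525      # active lines / total lines (assumed)
print(frame_time_ms * active_fraction)  # ~15.2 ms before the strobe can start
```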
The reason is good motion quality with low latency. CRTs give you low persistence comparable to modern gaming LCDs, but without the 15ms latency penalty of whole-screen strobing. CRTs strobe each line individually as soon as it's transmitted.
Also, this latency penalty only applies when you have strobing enabled. If it's disabled the LCD updates line by line just like a CRT. But then you get sample-and-hold blur, so it's not a good solution. Using software black frame insertion instead of hardware strobing is a compromise solution that you might prefer.
The problem could theoretically be made insignificant by splitting the strobe into multiple narrow bands (scanning backlight), but I am not aware of any modern gaming LCD that does this.
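For the software BFI route, the frontend basically just alternates the real frame with a black frame at double the game's frame rate. A minimal sketch of the idea using pygame; the window size, the grey fill standing in for a game frame, and the 120Hz pacing are all just assumptions for illustration:

```python
# Minimal software black frame insertion (BFI) sketch: run the loop at
# 120 Hz and show each 60 fps "game" frame for only half its period,
# followed by a black frame, which roughly halves persistence.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
frame = 0
running = True

while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    if frame % 2 == 0:
        screen.fill((128, 128, 128))   # stand-in for the real game frame
    else:
        screen.fill((0, 0, 0))         # inserted black frame
    pygame.display.flip()

    clock.tick(120)                    # pace at the (assumed) 120 Hz refresh
    frame += 1

pygame.quit()
```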
Would a 120Hz screen with BFI have the same input latency as a 60Hz screen without BFI?
Are we using "persistence" differently? I'm saying that at 60fps you will not notice a 3ms delay
Persistence refers to the image being displayed for the entire frame time instead of briefly strobed. This is the main cause of blur for console games on LCDs. It's unrelated to any 3ms LCD response time delay.
Blurbusters has explanations:
https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
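The back-of-the-envelope version of what that article says: perceived smear is roughly persistence multiplied by how fast the image moves across the screen. The scroll speed below is just an assumed example, not a measurement:

```python
# Rough persistence-to-blur arithmetic in the Blur Busters style.
scroll_speed_px_per_s = 960                     # brisk full-screen pan (assumed)
cases = {"sample-and-hold 60Hz": 1 / 60,        # image held for the whole frame
         "~1ms strobe / CRT-like": 0.001}       # image flashed briefly
for label, persistence_s in cases.items():
    print(f"{label}: ~{scroll_speed_px_per_s * persistence_s:.0f} px of smear")
# ~16 px vs ~1 px of smear: that's the blur difference being argued about here.
```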
give it up
afterglow is not motion blur anon, and it only shows up on black
CRT afterglow is so minimal compared to flat panel motion blur that it's hardly even worth mentioning. Kind of like comparing a scraped knee to quadruple amputation.
The reason people use a CRT is partially response time sure, but aesthetically it looks MUCH better.
I have played dreamcast games through vga/component on modern displays and it looks horrible.
On a CRT with component or a VGA monitor though? Fucking PRISTINE.
And no, filters do not compare even on an oled (and I have an 83in oled)
If the resolution of the display is divisible by 480 then there's no reason Dreamcast should look bad on a flatscreen thru component (and especially not thru VGA)
>If the resolution of the display is divisible by 480 then there's no reason Dreamcast should look bad on a flatscreen thru component
>video shows 1080p displays with smoothing filters enabled
lol
TVs do not have nearest neighbor scaling
modern screens won't just do integer scaling, why do you think people buy scalers
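If anyone wants to see what "divisible by 480" actually buys you, here's a rough numpy sketch of nearest-neighbour line duplication; pure illustration, not how any particular TV actually scales:

```python
# Nearest-neighbour scaling repeats each source line a whole number of times
# when the ratio is an integer; at 480 -> 1080 (2.25x) some lines get repeated
# twice and others three times, which is what makes it look uneven.
import numpy as np

src_height = 480
for dst_height in (960, 1440, 1080):
    # Which source line each output line samples under nearest-neighbour.
    src_line = (np.arange(dst_height) * src_height) // dst_height
    repeats = np.bincount(src_line, minlength=src_height)
    print(dst_height, "repeat counts per source line:", sorted(set(repeats.tolist())))
# 960 -> [2], 1440 -> [3], 1080 -> [2, 3]
```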
The flickering on that at 60Hz would be uncomfortable to say the least. Just no
The flickering is essential for good motion quality. That's the tradeoff you make when you use low frame rates like 60fps.
I try to run everything on at least S-Video. I have a big CRT, so the artifacts from composite are very noticeable, especially with the color red for some reason. Once you upgrade, everything looks smooth and pretty.
pic unrelated?
The difference between VGA and composite isn't a matter of cable quality.
It depends on your setup. If you're using a modern or nice display and an upscaler, video noise becomes more and more noticeable, and lower-bandwidth video cables such as composite start to look noticeably blurrier.
If you care about these things, look for shielded cables and as long as they’re actually shielded you should be good to go.
Be forewarned, it’s an expensive rabbit hole.
Back in the day RGB scart cables were the first thing I'd buy for a new console.
Obviously
Is there any reason to use component/hdmi on DC over VGA? Having a console with VGA out completely blew my mind back then, and I’m glad that most tvs still have it
Yes, not everyone has a VGA monitor
Is VGA the best option for Dreamcast? I hear it's dodgy for some games and isn't "real" VGA?
Yes
I use hd retrovision component cables on a consumer CRT.
Here's component
Well, are you going to activate windows?
nope
>premium cables
Arcades had the best RGB displays and graphics and every home console did their best to come close to that look and feel.
Inside all arcade machines was a rats nest of very cheap wires. The RGB wires were no different than the coin mech, marquee light or speaker wires. It didn't matter. Power wires might be a thicker gauge, but that's it. No cables were shielded, and the RGB wires could be bundled with power wires and wrapped together in electrical tape at times.
Home consumers go retarded over good quality vs cheap component, S-Video or VGA cables. They feel that if they pay more for cables the image will be magically enhanced. Most premium cables are bullshit marketing memes. Like anything, there is a low end of china shit with flaws and bad connectors that you should avoid, but that old Walmart S-Video cable you used on your DVD player back in 2003? Perfectly fine.
It makes more difference as you step up in resolution, and as the wires are closer together.
An unshielded JAMMA connector is fine for transmitting RGB at 240p, but try the same at 1600x1200 and you will get noticeable ghosting.
This was most noticed by Xbox 360 players buying cheap VGA cables instead of the OEM one.
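Rough numbers on why that is: the pixel clock, and so the bandwidth the cable has to carry cleanly, scales with resolution. These use a single assumed ~40% blanking overhead, so they're ballpark only:

```python
# Ballpark pixel clocks to show how much more bandwidth high resolutions need.
# The 1.4 blanking overhead is an assumed round figure, not a measured timing.
modes = {"240p arcade RGB": (320, 240, 60),
         "480p / VGA 640x480": (640, 480, 60),
         "1600x1200 @ 60Hz": (1600, 1200, 60)}
overhead = 1.4
for name, (w, h, hz) in modes.items():
    mhz = w * h * hz * overhead / 1e6
    print(f"{name:20s} ~{mhz:6.1f} MHz pixel clock")
# Roughly 6 MHz vs 26 MHz vs 161 MHz: cheap unshielded wiring that is fine
# at 240p has a much harder time at 1600x1200.
```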
You shop at WALMART?!?!
not to mention VGA cables. The shittiest PC with the cheapest possible VGA cable had amazing video quality compared to consoles, just because it had the proper 5 conductors for the video signal.
>every home console did their best to come close to that look and feel
By not having component/SCART cables in the box? lol
no, composite or rf is all you need
Fine at 240p
Not at 480i
You need AT LEAST s-video.
Depends on your set
>he doesn't use premium cables yet considers himself a valid gamer
Lol. Lmao, even
There aren't any games that I know of on Dreamcast that exploited composite so you would definitely want the cleanest image possible.
My general rule of thumb is
4th generation or below: composite on a CRT
5th generation and above: flat screen with composite
Why's PS1 composite better on a flatscreen than a CRT?
Honestly, I've been to so many LANs and kickbacks where they had CRTs and all that matters to me is if we're using analog displays rather than digital for consoles with analog out support. I fucking hate input delay, I always have my KV-9PT50 in my trunk just in case these fools are doing it wrong.
My general rule of thumb is do whatever you think looks best
>I only had RF cables as a kid
It.... it wasn't THAT bad... was it?
no but 7 year old me knew that yellow jack on the fancy TV looked fuckin baller
When it comes to old school vidya, ALWAYS go with the most accessible CRT you can get. Doesn't matter if it's RF or component or VGA, old school vidya was designed for displays with no input delay.
There is an analog video cable that runs to my PC CRT monitor. Occasionally, the image will go wonky for a moment. I solve this by getting up and moving the cable a bit until it works again.
I do not plan on replacing this cable until it stops being easy to remedy with a quick jiggle.
Emulation is better than composite, VGA or SCART, and makes games look gorgeous through upscaling.
For CRTs not really but LCD yeah.
I just got a composite cable for my PS3 and while PS1/PS2 games look better, PS3 games now run in a small square on my already square CRT. Did I do something wrong, or this is to be expected? The settings will only let me output to standard, no 480p.
only if they're gas filled