
Unfortunately, the flicker is essential for the excellent motion quality CRTs are renowned for. If the image on the screen stays constant while your eyes are moving, the image formed on your retina is blurred. Blurbusters has a good explanation:

https://blurbusters.com/faq/oled-motion-blur/

CRT phosphors light up extremely brightly when the electron beam hits them, then exponentially decay. Non-phosphor-based display technologies can attempt to emulate this by strobing a backlight or lighting the pixel for only a fraction of the frame time, but none can match this exponential decay characteristic of a genuine phosphor. I'd argue that the phosphor decay is the most important aspect of the CRT look, more so than any static image quality artifacts.
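For illustration, here's a minimal Python sketch of the difference between the two emission profiles; the 1 ms decay constant and 2 ms strobe window are made-up numbers for the comparison, not measurements of any real phosphor or backlight:

    # Sketch: a CRT-style phosphor pulse (instant rise, exponential decay)
    # versus a square strobe pulse, over one 60 Hz frame. The time constants
    # below are illustrative assumptions, not measured values.
    import math

    TAU = 0.001      # assumed phosphor decay constant, seconds
    STROBE = 0.002   # assumed strobe "on" window, seconds

    def phosphor(t):
        """Relative luminance t seconds after the beam hits the phosphor."""
        return math.exp(-t / TAU)

    def strobe(t):
        """Square pulse: full brightness for STROBE seconds, then dark."""
        return 1.0 if t < STROBE else 0.0

    for ms in (0.0, 0.5, 1.0, 2.0, 4.0, 8.0):
        t = ms / 1000.0
        print(f"{ms:4.1f} ms  phosphor={phosphor(t):.3f}  strobe={strobe(t):.1f}")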

There is such a thing as a laser-powered phosphor display, which uses moving mirrors to scan lasers over the phosphors instead of an electron beam, but AFAIK this is only available as modules intended for building large outdoor displays:

https://en.wikipedia.org/wiki/Laser-powered_phosphor_display




But why would the flicker be considered "excellent motion quality"?

In real life, there's no flicker. Motion blur is part of real life. Filmmakers use the 180-degree shutter rule as a default to intentionally capture the amount of motion blur that feels natural.

I can understand why the CRT would reduce the motion blur, in the same way that when I super-dim an LED lamp at night and wave my hand, I see a strobe effect instead of smooth motion, because the LED is actually flickering on and off.

But I don't understand why this would ever be desirable. I view it as a defect of dimmed LED lights at night, and I view it as an undesirable quality of CRTs. I don't understand why anyone would call that "excellent motion quality" as opposed to "undesirable strobe effect".

Or for another analogy, it's like how in war and action scenes in films they'll occasionally switch to a 90-degree shutter (or something less than 180) to reduce the motion blur to give a kind of hyper-real sensation. It's effective when used judiciously for a few shots, but you'd never want to watch a whole movie like that.


Sample-and-hold causes smearing when your eyes track an image that is moving across the screen. That doesn't happen in the real world: if you follow an object with your eyes it is seen sharply.

With strobing, moving objects still remain sharp when tracked.


You're correct, but sadly most games and movies are made with low frame rates. Even 120fps is low compared to what you need for truly realistic motion. Flicker is a workaround to mitigate this problem. The ideal solution would be 1000fps or higher on a sample-and-hold display.
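To put rough numbers on that (a back-of-the-envelope sketch using the usual persistence-blur rule of thumb; the 1920 px/s panning speed is an arbitrary example):

    # Perceived smear when the eye tracks motion on a sample-and-hold display
    # is roughly: speed (px/s) * time each frame stays visible (s).
    SPEED = 1920.0  # example object speed, pixels per second (assumed)

    for hz in (60, 120, 240, 480, 1000):
        persistence = 1.0 / hz            # full-persistence sample-and-hold
        print(f"{hz:5d} Hz sample-and-hold -> ~{SPEED * persistence:5.1f} px of smear")

    # A ~1 ms strobe (or a fast-decaying phosphor) gives ~2 px at any frame
    # rate, which is why flicker works as a workaround for low frame rates.
    print(f"1 ms strobe -> ~{SPEED * 0.001:.1f} px of smear")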


> Flicker is a workaround to mitigate this problem.

Isn't motion blur the best workaround to mitigate this problem?

As long as we're dealing with low frame rates, the motion blur in movies looks entirely natural. The lack of motion blur in a flicker situation looks extremely unnatural.

Which is why a lot of 3D games intentionally try to simulate motion blur.

And even if you're emulating an old 2D game designed for CRTs, I don't see why you'd prefer flicker over sample-and-hold. The link you provided explains how sample-and-hold "causes the frame to be blurred across your retinas" -- but this seems entirely desirable to me, since that's what happens with real objects in normal light. We expect motion blur. Real objects don't strobe/flicker.

(I mean, I can get you might want flicker for historical CRT authenticity, but I don't see how it could be a desirable property of displays generally.)


> Isn't motion blur the best workaround to mitigate this problem?

Motion blur in real life reacts to eye movement. When you watch a smoothly moving object, your eye accurately tracks it ("smooth pursuit") so that the image of that object is stationary on your retina, eliminating motion blur. If there are multiple objects moving in different directions you can only track one of them. You can choose where you want the motion blur just by focusing your attention. If you bake the motion blur into the video you lose this ability.


I guess it just comes down to aesthetic preference then.

If there's motion blur on something I'm tracking in smooth pursuit, it doesn't seem particularly objectionable. (I guess I also wonder how accurate the eye's smooth pursuit is -- especially with fast objects in video games, surely it's only approximate and therefore always somewhat blurry anyways? And even if you're tracking an object's movement perfectly, it can still be blurry as the video game character's arms move, its legs shift, its torso rotates, etc.)

Whereas if there's a flicker/strobe effect, that feels far more objectionable to me.

At the end of the day, my eyes are used to motion blur so a little bit extra on an object my eye is tracking doesn't seem like a big deal -- it still feels natural. Whereas strobe/flicker seems like a huge deal -- extremely unnatural, jumpy and jittery.


You should be able to emulate close to CRT beam scanout + phosphor decay given high enough refresh rates.

E.g. given a 30 Hz (60i) retro signal, a 480 Hz display has 16 full-screen refreshes for each input frame, while a 960 Hz display has 32. 480 Hz displays already exist, and 960 Hz is expected by the end of the decade.

You essentially draw the frame over and over with progressive darkening of individual scan lines to emulate phosphor decay.

In practice, you'd want to emulate the full beam scanout and not even wait for full input frames in order to reduce input lag.
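A toy sketch of that idea in Python, assuming a 60 Hz input on a 480 Hz panel; the decay constant and 240-line resolution are placeholders, and a real implementation (like the one discussed in the RetroArch issue linked below) would do this per scanline on the GPU:

    # Toy CRT beam/phosphor emulation on a high-refresh panel: each 60 Hz
    # input frame is shown as 8 sub-frames on a 480 Hz display, and every
    # scanline is dimmed according to how long ago the simulated beam
    # passed over it, so a bright band rolls down the screen like a CRT's
    # scanout. Decay constant and line count are placeholder assumptions.
    import math

    INPUT_HZ, PANEL_HZ = 60, 480
    SUBFRAMES = PANEL_HZ // INPUT_HZ   # 8 panel refreshes per input frame
    LINES = 240                        # e.g. a 240p source
    TAU_LINES = 40.0                   # assumed decay constant, in scanlines

    def subframe_gains(subframe_index):
        """Per-scanline brightness multipliers for one panel refresh."""
        # Beam position at the end of this sub-frame, in scanlines from the top.
        beam_line = (subframe_index + 1) * LINES / SUBFRAMES
        gains = []
        for line in range(LINES):
            age = beam_line - line     # scanlines since the beam hit this line
            if age < 0:
                age += LINES           # still decaying from the previous frame
            gains.append(math.exp(-age / TAU_LINES))
        return gains

    # Each sub-frame is the input frame scaled line-by-line by these gains.
    for i in range(SUBFRAMES):
        g = subframe_gains(i)
        print(f"sub-frame {i}: top gain {g[0]:.2f}, bottom gain {g[-1]:.2f}")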

Mr. Blurbuster himself has been pitching this idea for a while, as part of the software stack needed once we have 960+ Hz displays to finally get CRT-level motion clarity. For example:

https://github.com/libretro/RetroArch/issues/6984


> E.g. given a 30 Hz (60i) retro signal, a 480 Hz display has 16 full-screen refreshes for each input frame, while a 960 Hz display has 32. 480 Hz displays already exist, and 960 Hz is expected by the end of the decade.

Many retro signals are 240p60 rather than 480i60. Nearly everything before the PlayStation era.


I assume the problem here is that the resulting perceived image would be quite dark.

You'd need a screen that had a maximum brightness 10x more than normal, or something to that effect.
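Rough duty-cycle arithmetic (the 10% figure and 200-nit target are just illustrative assumptions):

    # If the emulated phosphor/strobe is effectively lit for only ~10% of each
    # frame, average brightness drops ~10x, so the panel needs roughly 10x the
    # peak brightness to look as bright as a normal sample-and-hold image.
    duty_cycle = 0.10        # fraction of the frame the pixel is lit (assumed)
    target_nits = 200        # desired perceived (time-averaged) brightness
    required_peak = target_nits / duty_cycle
    print(f"peak brightness needed: ~{required_peak:.0f} nits")  # ~2000 nits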


Is there actually a fundamental physical limit that keeps modern (O)LED displays from emulating that “flicker”, or is it merely that established display driver boards can't do it because it isn't a mainstream requirement? If it's the latter, wouldn't it still be much cheaper to build an FPGA-powered board that drives a modern panel to “simulate” the flicker (in quotes because it may not be simulating so much as simply not adding the artificial persistence) than to bootstrap a modern CRT supply chain?


The reason why this is a difficult problem is that physically emulating the flicker requires emulating the beam and phosphor decay, which necessitates a far higher refresh rate than just the input refresh rate. You'd need cutting-edge, extremely high refresh rate monitors. The best monitor I found runs at 500 Hz, but pushing the limits like that usually means concessions in other departments. Maybe you could do it with that one.


My LG has something like that, OLED Motion Pro. I believe it inserts blank frames, since the panel runs at a higher rate than the 24 fps content. Medium is noticeably darker, but OLEDs have plenty of brightness for my viewing space, and it makes slow pans look much nicer. High is even darker but adds noticeable flicker to my eyes.


72Hz is already a huge improvement in flicker from 60Hz though, and certainly maintains excellent motion quality.


But the refresh rate needs to match the frame rate to get the best motion quality. If you display the same frames multiple times you'll get ghost images trailing the motion. Lots of games are locked to lower frame rates, and there's barely any 72fps video.
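For example (rough numbers; the 24 fps content and 720 px/s pan speed are arbitrary assumptions):

    # Why repeated frames cause multiple ghost images on a strobed display:
    # each flash of the same frame lands at a different spot on the retina
    # while the eye keeps moving. The figures below are arbitrary examples.
    SPEED = 720.0          # tracked object speed, pixels per second (assumed)
    STROBE_HZ = 72         # display strobes at 72 Hz
    CONTENT_FPS = 24       # content only updates at 24 fps

    flashes_per_frame = STROBE_HZ // CONTENT_FPS   # 3 flashes of each frame
    ghost_spacing_px = SPEED / STROBE_HZ           # eye travel between flashes
    print(f"{flashes_per_frame} copies of each frame, ~{ghost_spacing_px:.0f} px apart")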



