Why does 30 fps on CRTs look so much smoother than LCDs?

I hope you didn't pay for a degree, my friend.

The reason is because CRTs have 60 fields per second when displaying 30 frames per second.

No.

Firstly, CRTs running interlaced content at 60 Hz do not display 30 frames per second. They display 60 half-frames (fields) per second, each a distinct moment in time. This is a critical distinction in a discussion about animation quality.

Secondly, interlacing is not an inherent part of CRT technology. Computer monitors and later consumer CRT TVs could display progressive-scan (full-frame) content with no problems at all. And there's nothing stopping an LCD or OLED from accepting interlaced content; it just gets deinterlaced before the panel shows it.
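To make the field-timing point concrete, here's a minimal sketch in Python (illustrative 480i-style numbers; the even/odd field order is just an assumption for the example). At 60 Hz interlaced, each field carries only half the scanlines, but every field is its own temporal sample, 1/60 s apart:

```python
# Minimal sketch: a 60 Hz interlaced signal delivers 60 temporally
# distinct half-resolution fields per second, not 30 full frames.
FIELD_RATE_HZ = 60
LINES = 480  # 480i-style line count, purely illustrative

def field_lines(field_index):
    """Scanlines carried by one field: even lines, then odd lines, alternating."""
    start = field_index % 2
    return list(range(start, LINES, 2))

def field_time(field_index):
    """Each field is its own temporal sample, 1/60 s after the previous one."""
    return field_index / FIELD_RATE_HZ

for k in range(4):
    lines = field_lines(k)
    kind = 'even' if k % 2 == 0 else 'odd'
    print(f"field {k}: t = {field_time(k):.4f} s, {len(lines)} {kind} lines")
```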

Furthermore those don't perfectly line up, creating that nice smooth look CRTs are known for.

The blending of images in an interlaced video actually decreases visual quality[1], and any effect it has on temporal quality is to lower it, producing blurrier motion than you'd get from full-frame content.

The real reason for the difference in motion quality between the technologies is the CRT's continuous-scan, impulse-type design. Each phosphor lights up only briefly as the beam passes and then naturally fades, while the rest of the image is continually being drawn elsewhere on the screen. To put it simply, it's the image disappearing quickly that has the biggest impact on how we perceive motion, rather than just having more frames delivered... and certainly not half-resolution sequential images being blended every refresh.

Flat-panel displays use sample & hold: the frame is scanned out to the panel somewhat like a CRT draws its picture, but every pixel then stays lit for the entire refresh interval until the next frame replaces it (basically). Because your eyes track moving objects across that held image, the result is perceived motion blur even when each individual frame is sharp.
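As a rough illustration of that persistence difference, here's a toy comparison of how one point on the screen behaves across a single 1/60 s refresh. The ~1 ms phosphor decay constant is an assumption picked for readability; real phosphors vary:

```python
# Toy comparison: luminance of one screen point over one 1/60 s refresh on an
# impulse-type display (CRT phosphor that flashes and decays) versus a
# sample & hold panel that keeps the pixel lit until the next refresh.
import math

REFRESH_S = 1 / 60
PHOSPHOR_DECAY_S = 0.001   # assumed ~1 ms persistence, purely illustrative

def crt_luminance(t):
    """Bright flash when the beam passes at t = 0, then exponential fade."""
    return math.exp(-t / PHOSPHOR_DECAY_S)

def hold_luminance(t):
    """Sample & hold: the pixel stays at full brightness all refresh long."""
    return 1.0

for step in range(6):
    t = step * REFRESH_S / 5
    print(f"t = {t * 1000:5.2f} ms   CRT: {crt_luminance(t):.3f}   hold: {hold_luminance(t):.3f}")
```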


[1]: This example highlights how the greater the difference from one frame to the next, the more distorted and blurry the image will look. While CRTs naturally handle this process better than sample & hold displays, it is still a problem.
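A rough sketch of the same idea, using a made-up 1-D moving object rather than real video: weave together two fields sampled 1/60 s apart and you get combing wherever something moved between them.

```python
# Illustration of the comb/blur artifact: two fields captured 1/60 s apart
# are woven into one frame while the object moves between captures.
WIDTH, HEIGHT = 16, 8

def render(object_x):
    """1-bit image of a 2-pixel-wide object at horizontal position object_x."""
    return [[1 if object_x <= x < object_x + 2 else 0 for x in range(WIDTH)]
            for _ in range(HEIGHT)]

field_a = render(4)[0::2]   # even-numbered lines, sampled at time t
field_b = render(8)[1::2]   # odd-numbered lines, sampled at time t + 1/60 s

# Weave the two fields back into a single "frame".
woven = [None] * HEIGHT
woven[0::2] = field_a
woven[1::2] = field_b

for row in woven:
    print("".join("#" if px else "." for px in row))
```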
