Almost all movies, except for experimental high-frame-rate films, are shot at 24 fps. However…
– Modern TVs are designed around 30 or 60 fps. Neither is evenly divisible by 24.
– OLED TVs are so fast that 24 fps content stutters
– LED TVs have motion blur
– Plasmas no longer exist
– 3:2 pulldown causes judder, because frames are shown for uneven lengths of time (sketched below)
So what gives? Why aren’t TVs designed around the dominant 24 fps? 24 fps has always been dominant for movies, so it’s not like it came out of nowhere. I don’t get it.
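For reference, the 3:2 cadence mentioned above works like this: every other film frame is held for three video fields instead of two, so four film frames fill ten fields and 24 fps lines up with 60 fields per second. A minimal sketch of that cadence (the frame labels and the helper name are just illustrative, not any particular TV's implementation):

```python
# Rough sketch of 3:2 pulldown: 24 film frames -> 60 interlaced fields per second.

def pulldown_32(film_frames):
    """Repeat film frames in a 3,2,3,2,... field pattern (the classic 3:2 cadence)."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # every other frame is held one field longer
        fields.extend([frame] * repeats)
    return fields

# 4 film frames become 10 fields, so 24 fps maps neatly onto 60 fields/s,
# but frames A and C stay on screen 50% longer than B and D -> visible judder on pans.
print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```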
Because television was broadcast at 29.97 fps.
A home TV wasn’t intended to emulate the ideal cinema experience; it was intended to display TV signals, and those were tied to the frequency of AC power:
60 Hz (and so 29.97 fps) for the US, 50 Hz (and so 25 fps) for Europe.
The tiny offset from 30 down to 29.97 came with the switch to colour: the rate was nudged slightly so the new colour subcarrier wouldn’t interfere visibly with the sound carrier. (The European 25 fps systems stayed at exactly 25.)
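A quick back-of-the-envelope check of that offset, using the published NTSC-M numbers (the variable names below are mine; the constants are the standard's): the colour standard kept 525 lines per frame but tied the line rate to the 4.5 MHz sound carrier, which lands the frame rate at exactly 30 × 1000/1001 ≈ 29.97 fps.

```python
from fractions import Fraction

# Standard NTSC-M figures: the colour standard kept 525 lines per frame but nudged
# the line rate so the colour subcarrier (455/2 x line rate) and the 4.5 MHz sound
# carrier (286 x line rate) interleave cleanly instead of producing a visible beat.
lines_per_frame = 525
line_rate = Fraction(4_500_000, 286)          # ~15,734.27 lines per second
frame_rate = line_rate / lines_per_frame      # ~29.97 frames per second

print(float(frame_rate))                                   # 29.97002997...
print(frame_rate == Fraction(30) * Fraction(1000, 1001))   # True: the famous 1000/1001 factor
```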
Because the electrical power in the US is 60 Hz, 30 fps was used for the TV standard: it was the minimum frame rate that looked smooth, and it let the transmitter and receivers easily stay in sync using the frequency of the power lines.
In other parts of the world, where the power lines are 50 Hz, they used 25 fps for their TV.
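A tiny sketch of that relationship as it worked in the original black-and-white standards (before the 29.97 tweak); the region labels and variable names are just for illustration:

```python
# Illustrative only: interlaced analog TV drew one field per AC cycle,
# and two interlaced fields make one complete frame.
mains_hz = {"US/NTSC region": 60, "Europe/PAL region": 50}

for region, hz in mains_hz.items():
    field_rate = hz          # one vertical sweep (field) per power-line cycle
    frame_rate = hz / 2      # two interlaced fields per full frame
    print(f"{region}: {field_rate} fields/s -> {frame_rate:.0f} fps")

# US/NTSC region: 60 fields/s -> 30 fps
# Europe/PAL region: 50 fields/s -> 25 fps
```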
It is because of how the television standard was originally created, which predates modern TVs. You have to go back to the first televisions: an amplitude-modulated signal was fed to an electron beam and set the intensity of the output. In order to make the electron beam cover the entire display instead of a single point, they used electromagnets to deflect the beam in a grid pattern (sweeping left to right, one row at a time). The amplitude-modulated signal and the electron beam deflection had to be perfectly timed and in sync with each other.

This beam deflection was based on the frequency of the electric grid, through things like analog frequency multipliers. Sufficiently accurate clocks were impractical at the time, synchronization would otherwise have been a challenge, and the electric grid ran at the same frequency everywhere.
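To put rough numbers on that sync chain, here is a sketch using the black-and-white NTSC figures (525 lines per frame, two interlaced fields per frame, 60 Hz mains); the variable names are mine, not anything from a standard:

```python
# Rough numbers for the sync scheme described above, black-and-white NTSC era.
mains_hz = 60                                        # vertical sweep locked to the power line
lines_per_frame = 525
fields_per_frame = 2                                 # interlaced: odd lines, then even lines

field_rate_hz = mains_hz                             # one vertical sweep (field) per AC cycle
frame_rate = field_rate_hz / fields_per_frame        # 30 complete pictures per second
line_rate_hz = frame_rate * lines_per_frame          # horizontal sweep derived by multiplying up

print(field_rate_hz, frame_rate, line_rate_hz)       # 60 30.0 15750.0
```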
In the old days, the technology to store the analog television signal simply didn’t exist, so it wasn’t an issue that the two frame rates didn’t match up. All content was generated live, or, if storage for later broadcast was absolutely necessary, they would point a film camera at a television set (where the frequency mismatch didn’t really cause huge issues), develop the film, then run it through a projector in front of a television camera that sent it out live again.
When the technology improved enough that the two standards started running into issues with one another, both were too set in their ways to change.