Why are modern televisions not designed for 24 frames per second when 99% of movies are in 24 fps?


Almost all movies, except for experimental high-frame-rate films, are in 24 fps. However…

– Modern TVs are designed around 30 or 60 fps. Neither is evenly divisible by 24.

– OLED TVs are so fast that 24 fps content has stutter

– LED TVs have blur

– Plasmas no longer exist

– 3:2 pulldown causes judder, because frames are shown for uneven lengths of time (see the sketch just after this list).
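
As a rough illustration of why 3:2 pulldown judders (a sketch of the arithmetic only, not how any particular TV implements it): 24 film frames have to fill 60 video fields each second, so frames are alternately held for 3 fields and for 2 fields, which means consecutive frames stay on screen for unequal amounts of time.

```python
# Sketch: map 24 film frames onto 60 interlaced fields using 3:2 pulldown.
# Frame A is held for 3 fields, frame B for 2, frame C for 3, and so on.

FIELDS_PER_SECOND = 60
FILM_FPS = 24

def pulldown_pattern(num_frames: int) -> list[int]:
    """Return how many fields each film frame occupies (3, 2, 3, 2, ...)."""
    return [3 if i % 2 == 0 else 2 for i in range(num_frames)]

pattern = pulldown_pattern(FILM_FPS)
assert sum(pattern) == FIELDS_PER_SECOND  # 12*3 + 12*2 = 60 fields

# On-screen time per frame is uneven: 3/60 s vs 2/60 s.
for i, fields in enumerate(pattern[:4]):
    print(f"frame {i}: {fields} fields = {fields / FIELDS_PER_SECOND * 1000:.1f} ms")
```

That alternation of roughly 50 ms and 33 ms per frame is the judder people notice most in slow pans.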

So what gives? Why aren’t TVs designed around the dominant 24 fps? 24 fps has always been dominant for movies, so it’s not like it came out of nowhere. I don’t get it.

In: Technology

7 Answers

Anonymous 0 Comments

Because television was broadcast at 29.97 fps.

A home TV wasn’t intended to emulate the ideal cinema experience; it was intended to display TV signals, and its timing was tied to the frequency of AC power.

60 Hz (and so 29.97 fps) for the US, 50 Hz and 25 fps for Europe.

The tiny offset from 30 (down to 29.97) came with the switch to colour, to keep the new colour subcarrier from interfering with the sound carrier.
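
For reference, 29.97 is not an arbitrary number: it is the original 30 fps scaled by 1000/1001. A quick check of the arithmetic:

```python
# NTSC colour frame rate = 30 fps slowed by a factor of 1000/1001,
# chosen so the colour subcarrier doesn't beat against the sound carrier.
ntsc_color_fps = 30 * 1000 / 1001
print(round(ntsc_color_fps, 3))  # 29.97
```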

Anonymous 0 Comments

Modern TVs are fully capable of 24 fps. If the media player supports it, you can enable frame rate matching, which sets the frame rate sent to the TV to match the source material.
If you watch a movie, you’ll see it at its original frame rate. No stutter issues.
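
A rough sketch of what "frame rate matching" amounts to inside a player (the function and variable names here are made up for illustration, not any real player's API): look at the source frame rate and, if the display offers a compatible refresh rate, switch the output mode before playback starts.

```python
# Sketch of frame-rate matching logic. `supported_refresh_rates` would come
# from the display (e.g. via its advertised modes); what the player does with
# the chosen rate depends on the video stack it uses.

def pick_output_rate(source_fps: float, supported_refresh_rates: list[float]) -> float:
    # Prefer a refresh rate that is an integer multiple of the source rate
    # (24 -> 24, 48, 72, 120 ...), so every frame is shown equally long.
    candidates = [r for r in supported_refresh_rates
                  if abs(r / source_fps - round(r / source_fps)) < 0.01]
    return min(candidates) if candidates else max(supported_refresh_rates)

print(pick_output_rate(24.0, [60.0, 50.0, 30.0, 25.0, 24.0]))  # -> 24.0
print(pick_output_rate(24.0, [60.0, 50.0, 120.0]))             # -> 120.0
```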

Anonymous 0 Comments

Because the electrical power in the US is 60 Hz, 30 fps was used for the TV standard: it was the minimum frame rate that looked smooth, and it could easily keep the transmitter and receivers in sync using the frequency of the power lines.

In other parts of the world that use 50 Hz for power lines, 25 fps was used for TV.
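
The arithmetic behind that, roughly: the field rate was locked to the mains frequency, and with interlacing two fields make up one full frame.

```python
# Field rate tied to mains frequency; two interlaced fields per full frame.
for mains_hz in (60, 50):
    fields_per_second = mains_hz          # one field per AC cycle
    frames_per_second = fields_per_second / 2
    print(f"{mains_hz} Hz mains -> {frames_per_second:.0f} fps")  # 30 fps / 25 fps
```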

Anonymous 0 Comments

Some 120 Hz TVs can display 24 fps without pulldown, because 120 is an exact multiple of 24: each film frame is simply shown for five refreshes.

TV broadcast has always been 30 frames / 60 fields a second, because the power frequency has always been 60 cycles a second, and in the vacuum tube days it was too expensive to build a clock into every TV set to run it at some other frame rate.
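
A small check of which refresh rates can hold each 24 fps film frame for the same whole number of refreshes (numbers only, no claim about any specific TV model):

```python
# Refresh rates that divide evenly by 24 can hold every film frame for the
# same whole number of refreshes, so no 3:2 pulldown is needed.
for refresh_hz in (60, 120, 24, 48, 50):
    repeats = refresh_hz / 24
    even = refresh_hz % 24 == 0
    print(f"{refresh_hz:3d} Hz: {repeats:.2f} refreshes per frame"
          f" -> {'even' if even else 'needs pulldown'}")
```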

Anonymous 0 Comments

Another issue I don’t see anyone mentioning: video game consoles. Modern TVs are not for movies only, so there is no reason to design them around any one particular frame rate.

Anonymous 0 Comments

It is because of how the television standard was originally created, which predates modern TVs. You have to go back to the first televisions: an amplitude-modulated signal was fed into the electron beam and set the intensity of the output. To make the electron beam cover the entire display instead of a single point, electromagnets deflected the beam in a grid pattern (left to right across each line, one row at a time). The amplitude-modulated signal and the electron beam deflection had to be perfectly timed and in sync with each other. This deflection was based on the frequency of the electric grid, through things like analog frequency multipliers, because accurate enough clocks were impractical at the time and synchronization would have been a challenge, while the electric grid was the same frequency everywhere.
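
To give a sense of how tightly that scan timing is coupled to the frame rate, here is the back-of-the-envelope arithmetic for the US standard that came out of this design (standard NTSC figures, nothing specific to any one set):

```python
# NTSC raster timing: 525 scan lines per frame, drawn as 2 interlaced fields.
lines_per_frame = 525
frames_per_second = 30 * 1000 / 1001      # ~29.97 fps after the colour change
line_rate_hz = lines_per_frame * frames_per_second
print(f"horizontal scan rate: {line_rate_hz:.0f} Hz")  # about 15734 Hz
```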

In the old days, the technology to store an analog video signal simply didn’t exist, so it wasn’t an issue that the two rates didn’t match up. All content was generated live, or, if storage for later broadcast was absolutely necessary, they would point a film camera at a television set (where the frequency mismatch didn’t really cause huge issues), develop the film, then run it through a projector in front of a television camera that sent it out live again.

When the technology improved enough that the two standards started running into issues with one another, both were too set in their ways to change.

Anonymous 0 Comments

HDMI supports 24p mode and many TVs can use it. You can check by plugging a computer into them and seeing what display frequencies are available. Usually it’ll support 60, 50, 30, 25, 24.

Whether the media player you are using is smart enough to switch to 24p is another matter.
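
If you want to check this from a Linux box plugged into the TV, one rough way is below (a loose sketch that assumes `xrandr` is installed; the exact output format varies, so the parsing is deliberately crude):

```python
# Loose sketch: ask xrandr which refresh rates the connected display offers
# and report whether 24 Hz (24p) is among them.
import re
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout
rates = {float(r) for r in re.findall(r"\b(\d{2,3}\.\d{2})\b", out)}
has_24p = any(abs(r - 24.0) < 0.2 or abs(r - 23.976) < 0.2 for r in rates)
print("refresh rates seen:", sorted(rates))
print("24p available:", has_24p)
```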