Assuming you’re referring to “old” CRT TVs and computer monitors… Productions have specialists with equipment that can play video on the screen at the same frame rate as the camera recording it. The issue is that a CRT redraws its image line by line, so when the monitor refreshes at a different rate than the camera captures frames, each frame catches a partially drawn picture and you see a rolling dark band. More info:
https://en.wikipedia.org/wiki/24p
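The rate mismatch described above can be sketched with a little arithmetic. This is only an illustration, not anything a camera actually computes; the function name and the example rates (59.94 Hz NTSC refresh, 24 fps film) are mine:

```python
# Sketch: why a CRT shows a rolling band on camera when the refresh rate
# and the camera frame rate don't line up. Values are illustrative.

def beat_frequency(display_hz: float, camera_fps: float) -> float:
    """Apparent roll/flicker rate: how far the display refresh sits from
    the nearest whole multiple of the camera frame rate."""
    nearest_multiple = round(display_hz / camera_fps) * camera_fps
    return abs(display_hz - nearest_multiple)

crt_refresh = 59.94   # NTSC CRT vertical refresh, Hz
film_rate = 24.0      # standard film camera frame rate, fps
print(beat_frequency(crt_refresh, film_rate))    # nonzero -> visible rolling band

synced_rate = 29.97   # camera synced to exactly half the refresh rate
print(beat_frequency(crt_refresh, synced_rate))  # ~0 -> band disappears
```

A nonzero result means the shutter and the refresh keep drifting in and out of phase, which is exactly the moving band you see in amateur footage of CRTs.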
It’s actually a really simple answer: the screen in the movie has nothing on it and is instead a mini green screen. The graphics are then added in post-production. That eliminates the effect you described and also avoids the problem of syncing what’s on the screen with what the actors are doing.
Most screens in movies and TV shows are faked with CGI.
They just replace the real screen with an image of whatever they want. Screen flickering is part of the reason, but it also solves issues with screen glare, the screen appearing dim, etc. It’s just easier to put in a fake screen in post-production than to deal with a real screen image.
LGR made a video on how he records screens https://youtu.be/aJKV7fTJRIc
Flickering is removed by adjusting the camera’s shutter speed to match the display’s refresh rate; it can also be fixed by editing the final video.
Moiré patterns can be removed by keeping the display slightly out of focus, by using an optical low-pass filter, or by simply blurring it in post.
Colors are also adjusted in the video editor, and reflections are removed with a polarizing filter and well-placed lights.
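The shutter-matching trick works because the exposure time covers a whole number of refresh cycles, so every frame integrates the same amount of light. A quick sketch of which conventional shutter speeds satisfy that, assuming an example 60 Hz display (the function name and tolerance are mine, not any camera API):

```python
# Sketch: shutter speeds (1/denominator seconds) whose exposure spans a
# whole number of display refresh cycles, so no flicker is recorded.
# Example display rate of 60 Hz is illustrative.

def flicker_free_shutter_speeds(display_hz: float, max_denominator: int = 500):
    """Return denominators d such that a 1/d-second exposure covers an
    integer number (>= 1) of display refresh cycles."""
    speeds = []
    for denom in range(1, max_denominator + 1):
        cycles = display_hz / denom  # refresh cycles captured per exposure
        if round(cycles) >= 1 and abs(cycles - round(cycles)) < 1e-9:
            speeds.append(denom)
    return speeds

# For a 60 Hz display: 1/60 (one full cycle), 1/30 (two), 1/20 (three), etc.
print(flicker_free_shutter_speeds(60.0, 120))
```

Anything off this list (say 1/48 on a 60 Hz display) exposes a fractional cycle, and that leftover fraction shows up as the banding the answer above describes.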