Why is it that when you see a computer screen in a movie or on tv it doesn’t do that distorted thing that always happens when you see a computer screen in a video shot with a commercial video camera?

In: Technology

4 Answers

Anonymous 0 Comments

Assuming you’re referring to “old” CRT TVs and computer monitors: productions hire specialists with equipment that plays video on the screen at a rate synced to the camera recording it. The distortion you see in amateur footage happens because the camera picks up the vertical scan of the CRT, which refreshes at a different rate than the camera’s sensor captures frames. More info:

https://en.wikipedia.org/wiki/24p
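
To make the mismatch concrete, here is a rough Python sketch of the arithmetic; the 60 Hz refresh and 24 fps camera figures are just example values, not something stated in the answer above:

```python
# Rough sketch of why an unsynced CRT shows a rolling band on camera.
# The 60 Hz and 24 fps figures are illustrative assumptions.

refresh_hz = 60.0      # CRT redraws the whole screen 60 times per second
camera_fps = 24.0      # camera frame rate

frame_period = 1.0 / camera_fps
refreshes_per_frame = frame_period * refresh_hz   # 2.5 refreshes per frame

# The fractional part is how far the scan "phase" drifts between frames.
phase_drift = refreshes_per_frame % 1.0           # 0.5 of a screen per frame

# The dark band appears to roll through the image at the beat frequency,
# i.e. how often that drift wraps around a full screen.
beat_hz = phase_drift * camera_fps                # 12 rolls per second here

print(f"refreshes/frame={refreshes_per_frame}, drift={phase_drift}, "
      f"beat={beat_hz} Hz")

# If the rates divide evenly (e.g. a 72 Hz monitor filmed at 24 fps),
# the drift is 0.0 and the band stays put, which is what on-set video
# playback crews arrange.
```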

Anonymous 0 Comments

It’s actually a really simple answer: the screen in the movie has nothing on it and is instead a miniature green screen. The graphics are then added during post-production. That eliminates the effect you described and also sidesteps the problem of timing what’s on the screen with what the actors are doing.
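
For illustration, here is a toy chroma-key composite in Python with NumPy showing the basic idea behind that post-production step; the function name, threshold, and file names are made up for the example, and real compositing software does far more (tracking, edge blending, perspective matching):

```python
import numpy as np

def chroma_key(shot: np.ndarray, replacement: np.ndarray,
               green_margin: int = 40) -> np.ndarray:
    """Toy green-screen composite: wherever the shot is strongly green,
    substitute the pre-rendered screen graphic instead.

    shot, replacement: HxWx3 uint8 RGB arrays of the same size.
    green_margin: how much greener than red/blue a pixel must be to count
                  as part of the green card (threshold is an assumption).
    """
    r = shot[..., 0].astype(int)
    g = shot[..., 1].astype(int)
    b = shot[..., 2].astype(int)
    is_green = (g - np.maximum(r, b)) > green_margin   # boolean key matte
    out = shot.copy()
    out[is_green] = replacement[is_green]
    return out

# Usage sketch (file names are hypothetical):
# from PIL import Image
# frame  = np.array(Image.open("set_frame.png").convert("RGB"))
# screen = np.array(Image.open("fake_ui.png").convert("RGB"))
# comp   = chroma_key(frame, screen)
```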

Anonymous 0 Comments

Most screens in movies and TV shows are faked with CGI.

They just replace the real screen with an image of whatever they want. Screen flickering is part of the reason, but it also solves issues with screen glare, the screen appearing dim, and so on. It’s just easier to drop in a fake screen in post-production than to deal with a real screen image.

Anonymous 0 Comments

LGR made a video on how he records screens: https://youtu.be/aJKV7fTJRIc

Flickering is removed by adjusting the shutter speed to match the display’s refresh rate; it can also be cleaned up when editing the final video.
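
As a rough illustration of the shutter-speed matching, this Python sketch checks which common shutter speeds cover a whole number of refresh cycles on an assumed 60 Hz display (the numbers are examples, not taken from the video):

```python
# Sketch: pick shutter speeds that swallow whole refresh cycles, so every
# frame collects the same amount of light from the display.

refresh_hz = 60.0                 # assumed display refresh rate
refresh_period = 1.0 / refresh_hz

# A few common shutter speeds, in seconds
candidates = [1/30, 1/40, 1/50, 1/60, 1/80, 1/100, 1/120]

for shutter in candidates:
    cycles = shutter / refresh_period
    flicker_free = abs(cycles - round(cycles)) < 1e-6
    print(f"1/{round(1/shutter)} s -> {cycles:.2f} refresh cycles"
          f"{'  (flicker-free)' if flicker_free else ''}")

# On a 60 Hz panel, 1/30 s and 1/60 s cover exactly 2 and 1 cycles, while
# 1/120 s catches only half a cycle, so 1/60 or 1/30 are the safe picks.
```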

Moire patterns can be removed by keeping the display slightly out of focus, using a low-pass filter, or simply blurring it in post.
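
Digitally, that low-pass/blur step can be as simple as a mild Gaussian blur; here is a minimal Pillow sketch, with the file name and radius chosen arbitrarily for the example:

```python
from PIL import Image, ImageFilter

# Digital stand-in for an optical low-pass filter or slight defocus:
# a mild Gaussian blur removes detail finer than the sensor can sample
# cleanly, which is what produces the moire beat pattern in the first place.
frame = Image.open("monitor_closeup.png")
softened = frame.filter(ImageFilter.GaussianBlur(radius=1.5))
softened.save("monitor_closeup_no_moire.png")
```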

Colors are also adjusted in the video editor, and reflections are removed with a polarizing filter and well-placed lights.
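
The color adjustment can be as basic as a per-channel gain; this NumPy sketch shows that simplest case, with purely illustrative gain values (a real grade in an editor is more involved):

```python
import numpy as np

def adjust_screen_color(frame: np.ndarray,
                        gains=(1.08, 1.0, 0.92)) -> np.ndarray:
    """Crude per-channel RGB gain, the simplest form of the color
    correction an editor applies. The gains are arbitrary example values
    (warming the image slightly to counter a bluish display)."""
    out = frame.astype(np.float32) * np.asarray(gains, dtype=np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```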