ELI5- why do screens/lights flicker in videos taken by phone or even yt vids, but not in movies and professional productions?



Professionals use synchronized cameras, or simply replace the screen displays in post-production with CGI. Professional cameras have lots of adjustments, like longer exposure times, that can give cinematographers more options to capture displays.

Without getting into why the imperfections happen in the first place, professional productions have a post-production step where imperfections are handled. Your phone videos often do not.

Because they change the settings on their camera to compensate.

Screens are just strobe lights that flash so fast you can’t see them turn off. The flashing screen on camera shows up when the timing doesn’t line up: a typical screen refreshes 60 times per second, while cameras often record at 30 frames per second.

TV and computer screens appear to *roll* in videos because the camera’s capture rate doesn’t quite match the screen’s refresh rate. If it’s professionally done, then it’s a simulated screen.

The light flickers at a certain speed; cameras with manual shutter control can be adjusted so that the flicker doesn’t show. Your phone usually doesn’t offer this.

The rate at which a camera captures images is likely not synced to the refresh rate of a screen or a fluorescent light (which also flickers rapidly).

This is to answer the first part of your question. Screens flicker in video taken by a phone, or any other video-capturing device, for one reason only: the frame rate of the recording does not match the mains frequency (hertz) of the location being shot. I battle this daily with the video team I work with, as we shoot in different countries regularly. Most of the younger generation might not know this, since most video these days is consumed online, which will play any frame rate, progressive or interlaced, all the same; but back in the old days, each country had a broadcasting standard.

There are several broadcast systems across the world, but two of the main ones are PAL and NTSC. These systems use frame rates derived from the cycles of the electricity supply, aka hertz. For example, the US, which generally has a 120 V supply, runs at 60 Hz. This means all lights, screens, and anything else that cycles will flicker at that rate. NTSC is generally considered to be 30 fps, or 29.97 fps if you use drop frame. What is 60 divided by 2? 30. That is why NTSC sits at that frame rate, which means if you are shooting native NTSC at 29.97 (or 23.976 fps, for that matter) you could shoot anywhere in the US, and it’s a good bet that if you keep your shutter speed at the defaults you will not suffer any flickering.
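The “divide the mains frequency” arithmetic above can be sanity-checked with a tiny sketch (my own toy model, not from the comment; note lights on AC actually pulse at twice the mains frequency, since intensity peaks on both halves of the cycle):

```python
# Hypothetical helper: does every camera frame span a whole number of
# light-flicker cycles? If yes, each frame catches the same amount of
# light and there is no visible beating.

def frames_fit_mains(fps: float, mains_hz: float) -> bool:
    flicker_hz = 2 * mains_hz              # light pulses per second on AC
    cycles_per_frame = flicker_hz / fps
    return abs(cycles_per_frame - round(cycles_per_frame)) < 1e-9

print(frames_fit_mains(30, 60))  # NTSC-ish frame rate on US 60 Hz mains
print(frames_fit_mains(25, 60))  # PAL frame rate on US mains: mismatch
print(frames_fit_mains(25, 50))  # PAL frame rate on 50 Hz mains
```

The same check explains why 24 fps film needs shutter tweaks under 50 Hz lighting.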

PAL, on the other hand (think Europe and parts of Asia), is used in locations with a 240 V supply, which runs at 50 Hz. What is 50 divided by 2? Yes, 25. Guess what the frame rate of PAL is? 25 fps. This ensures that if you are shooting in a PAL, aka 240 V, location, you will also not suffer flickering in most cases. So there is a reason behind these systems. Of course this is not absolute: you will get the odd flicker from halogens or fluorescent tubing, and in most cases changing the shutter speed slightly will fix it, though that might not always be the best choice in low-light situations.

General rule of thumb: shoot at the frame rate that matches the mains frequency of the country you are in, and you will probably be OK.

Phones tend to shoot at variable, non-fixed frame rates. Not all, but a majority do, so you will find flickering happens in some recordings but not in others.

Films, on the other hand, have budgets, and budgets pay for very talented people to find ways to sync the refresh rates of on-screen displays to match the frame rate of the cameras, and then everything is perfect.

Old screens like CRT monitors refresh the image line by line, from one end of the screen to the other. Because your eyes’ persistence of vision is longer than the time it takes the screen to refresh, you see it as a single image.

Video camera shutters are usually set much faster, while capturing anywhere from roughly 24 to 60 frames per second. During the brief time the shutter is open for each frame, only part of the screen gets exposed. And because the camera captures many frames a second, each frame catches a different part of the refresh than the last, which gives that flickering effect.

If you use a longer shutter speed, it will reduce this effect by letting more light in per frame. You can also synchronize the frame rate of the camera with the monitor’s refresh rate.
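The longer-shutter effect can be simulated numerically. This is my own toy model (all numbers are made up for illustration): a light on 50 Hz mains flickers at 100 Hz, and we sample it at 30 fps with a fast and a slow shutter, comparing how much the captured brightness jumps between frames.

```python
import math

def frame_brightness(start, exposure, flicker_hz=100.0, steps=1000):
    # Average |sin| light intensity over the open-shutter interval.
    total = 0.0
    for i in range(steps):
        t = start + exposure * i / steps
        total += abs(math.sin(math.pi * flicker_hz * t))
    return total / steps

def flicker_amount(exposure, fps=30.0, frames=60):
    # Spread of frame-to-frame brightness: big spread = visible flicker.
    vals = [frame_brightness(n / fps, exposure) for n in range(frames)]
    return max(vals) - min(vals)

fast = flicker_amount(1 / 1000)  # fast shutter: brightness jumps around
slow = flicker_amount(1 / 50)    # shutter spans 2 full cycles: steady
print(fast > slow)
```

A 1/50 s exposure covers exactly two flicker cycles, so every frame averages to the same brightness regardless of where it starts.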

Because of the shutter speed of some cameras, helicopter blades can appear to stand still.

In the old days we had a sync box which would send a timing signal to the camera and any TVs to make sure that the sync rate of the camera matched the TVs. A mismatched sync rate causes flicker.

Moving pictures on an old TV are basically a series of still images strung together (like an updated version of flip cards). The movie camera is also taking a series of still images and stringing them together. If the camera and the screen are not working together, you get flicker.

[Besides post-production, Tom Scott does a good explanation of the effect.](https://www.youtube.com/watch?v=uzP8FFKpwQ0)

First off, lights connected to an outlet aren’t always on; they flicker at the frequency of your grid, so 50 or 60 Hz depending on where you live. Some cameras record at 25 fps; with a 50 Hz system, every frame will have light, as 50 is a multiple of 25. On the other hand, a camera recording at 25 fps on a 60 Hz grid will not match, and thus you will see lights flickering when you record. A similar mismatch happens between screen refresh rates and camera frame rates. On most cameras you can change your fps to 25 (the PAL option on iPhone) to compensate for the flickering.

Production companies will make sure that everything matches, plus they have post-production and whatnot.

More expensive lightbulbs can be made not to flicker, despite the alternating current in your wall outlet.

Lights aren’t “turned on” without any pause. They all flicker a bit because of the electricity supply, but this flickering is too fast for our eyes to notice.

Every camera likewise does not record without any pause: it takes an image, then after some time takes another.

When the camera takes images faster than the time it takes for a screen to “shine, then pause/rest, then shine again”, you end up seeing this on recordings as an annoying flicker. The reason cameras take so many images so fast in the first place is that the footage gets very “smooth” and motion is captured in much more detail; the annoying flicker from screens is the drawback of this feature. Theoretically, there are two ways to avoid it: 1) record slower (there are various ways to do this) so the flickering vanishes, or 2) give the screens a shorter pause/rest time. In practice we obviously go with option 1, because if you tamper with the electricity supply of your electronics yourself, you could break them or even hurt someone.

Some light bulbs don’t noticeably flicker, because instead of rapidly turning off for a moment and then on again, they faintly get dimmer then brighter (you can do this with smart placement of diodes in the circuit to smooth the electric supply into a nearly constant current; I believe many newer household bulbs are made this way, but I’m not sure), and the camera may not pick up this faint back-and-forth.
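The smoothing idea above can be illustrated with a crude numeric sketch (my own toy model, not a real circuit simulation): full-wave rectify a 50 Hz sine, low-pass it roughly the way a smoothing capacitor would, and compare the ripple.

```python
import math

dt = 1e-4                                  # 0.1 ms time step
raw, smooth, s = [], [], 0.0
for i in range(5000):                      # 0.5 s of 50 Hz mains
    v = abs(math.sin(2 * math.pi * 50 * i * dt))   # full-wave rectified
    s += (v - s) * dt / 0.05               # RC-style low-pass, tau = 50 ms
    raw.append(v)
    smooth.append(s)

steady = smooth[3000:]                     # skip the start-up transient
print(max(raw) - min(raw))                 # big swing: light would flicker
print(max(steady) - min(steady))           # small ripple: steadier glow
```

The time constant of 50 ms is an arbitrary choice; anything much longer than the 10 ms flicker period flattens the output the same way.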

This is just, roughly, the theory of why you see it and how you could in principle overcome it. I don’t actually work in filming, so the current methods and solutions are definitely smarter, but I don’t believe an ELI5 answer should get too technical.

Explaining like you’re five? Here we go…

Funny enough, all man-made light flickers, or turns off and on really, really fast. Our eyes see at about 24 frames, or pictures, in one second. That’s why when things move faster than that, they look blurry, kind of like how helicopter blades or the hubcaps of cars blur in motion.

Now, ever turn on your fan, and it goes faster and faster, then it looks like it’s moving backwards? That’s because the fan is turning so fast, your eyes only see the fan 24 times a second. So when your eye sees it the next time, the fan’s blades are in a different position, and it ‘appears’ to move backwards!
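The backwards-fan illusion (the wagon-wheel effect) boils down to simple modular arithmetic. A rough sketch, with made-up numbers: if a blade turns more than half a revolution between samples, the smallest apparent motion points the other way.

```python
def apparent_step_deg(rotation_hz: float, sample_fps: float) -> float:
    # True rotation between consecutive samples, wrapped into one turn.
    step = (360.0 * rotation_hz / sample_fps) % 360.0
    # The eye/camera perceives the shorter arc, which may be backwards.
    return step - 360.0 if step > 180.0 else step

print(apparent_step_deg(2.0, 24.0))    # slow fan: small forward step
print(apparent_step_deg(23.0, 24.0))   # fast fan: appears to creep backwards
```

At exactly 24 rotations per second the step is 0, and the fan looks frozen, just like the helicopter blades mentioned elsewhere in the thread.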

Now, with certain cameras, you can adjust something called “shutter speed”. When you take a picture, the shutter speed is basically how long the camera ‘blinks’ in a second.

So blink right now. That’s like taking a picture.

Now blink really fast! That’s what a video camera is doing. It’s taking pictures and putting them together. That’s the basics of a video.

And you can tell both cameras to ‘blink slower’ or ‘blink faster’.

Now, TVs and monitors and even fluorescent lights ‘blink’ at a certain speed. If you tell your camera to blink at the same speed, your camera will show you what your eyes see…a steady light, or a spinning fan, or a solid glowing computer screen.

If you tell your camera to blink at a DIFFERENT speed, your camera can show you all kinds of things: a blinking computer monitor, or a fluorescent light being red instead of white. You can even make a flying helicopter look like it’s floating without its blades turning!

Go ahead and try it out!

Today, CGI is often used. I worked with a Mac II (circa 1989) that was going to be filmed, and the film-makers brought a special video card that allowed the generated video to be synced with the camera. It filmed rock solid.

In addition to all the great answers, you might be interested in this modern use of synchronized refresh rates and recording: [https://www.reddit.com/r/nextfuckinglevel/comments/odmy22/different_channels_different_ads/](https://www.reddit.com/r/nextfuckinglevel/comments/odmy22/different_channels_different_ads/)

Several different commercials are played at the same time, on the same screen. It’s recorded at a high frame rate and broadcast at a low frame rate, so different broadcasters can choose a different frame offset.
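The offset trick described above can be sketched in a few lines (all numbers are hypothetical, just to show the idea): a fast screen interleaves the ads frame by frame, and each synced low-frame-rate camera samples only one of them.

```python
ads = ["A", "B", "C", "D"]                        # four ads, one per offset
screen = [ads[i % len(ads)] for i in range(100)]  # interleaved screen frames

def broadcast(offset: int, step: int = 4):
    # A synced camera samples every 4th screen frame, starting at `offset`.
    return screen[offset::step]

print(broadcast(0))  # every captured frame shows ad "A"
print(broadcast(2))  # every captured frame shows ad "C"
```

Each broadcaster just picks a different `offset`, and viewers of each channel see a single clean ad.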

What he means by synced cameras is that the filming rate matches the refresh rate of the screen being filmed; if you match your fps to the refresh rate, you don’t see the “flicker”.