IR sensors are cheap and easy to manufacture, and they don't require approval from the FCC. Since you'll always be facing the TV when you use the remote, there's no need for RF: the IR sensor will always be facing you. Also, since IR is mostly limited to line of sight, it's highly unlikely that your neighbor's remote could control your TV, even if they have the same make and model and live very close by. Whereas with RF this could happen, which would make the system more complicated, as special encoding methods would be necessary.
It's cheap to make and has the advantage of being line of sight, so without adding any complex pairing system it's limited to the room and won't control stuff in a different room, apartment, or even house, which would be the case if you just transmitted, say, radio waves.
It has the drawback of being line of sight, which is why things like game consoles use radio waves instead. But then you need to pair the device and the remote, which adds user complexity, and radio is a bit more expensive to use.
Remotes use IR rather than visible light so you can't see the blinking with the naked eye (many digital cameras can pick it up, though).
Related: some early remote controls used high-frequency (15,000-20,000+ Hz) sound instead of light. The researchers developing the technology were mostly old (enough) men whose hearing had degraded to the point that they couldn't hear the remotes. Their young secretaries and lab assistants, on the other hand, *could* hear the remotes and *hated* them. The rise of transistors and IR LEDs was what finally killed this version of the technology.
Edit: These [early remotes](https://www.youtube.com/watch?v=PlgSuaIHYsY) are also where the slang "clicker" for remote controls came from, because some made a distinctly audible click.
IR is cheap and energy-efficient.
Also, the waves attenuate very quickly and can't penetrate most materials, which is actually very beneficial for TVs.
If the signal could pass through walls, you'd have to worry about your neighbor changing your channel. Generally speaking, you're only going to be using the remote in front of the actual TV, so line of sight isn't an issue.
Not only that, but the lack of interference from other TVs makes it easier to program the remote itself, because you don’t need as many complex ID mechanisms to make sure the right remote is interacting with the right TV.
You can use much more generic signals because of this, which makes everything cheaper and simpler.
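To make this concrete, here's a minimal sketch of how a common IR scheme works, loosely modeled on the NEC protocol used by many consumer remotes. The codes (address `0x04`, command `0x08`) are made up for illustration; the point is that a fixed 8-bit device address baked into the remote is the entire "ID mechanism", with no pairing step at all:

```python
BURST = 562  # microseconds; the NEC basic pulse unit is ~562.5 us

def nec_frame(address: int, command: int) -> list[tuple[int, int]]:
    """Return (mark_us, space_us) timing pairs for one NEC-style frame.

    Each byte is followed by its bitwise inverse as a cheap integrity
    check; bits go out LSB-first. A '0' is burst + equal space, a '1'
    is burst + a 3x-longer space.
    """
    payload = [address, address ^ 0xFF, command, command ^ 0xFF]
    pairs = [(9000, 4500)]            # leader: 9 ms mark, 4.5 ms space
    for byte in payload:
        for i in range(8):            # LSB first
            bit = (byte >> i) & 1
            pairs.append((BURST, 3 * BURST if bit else BURST))
    pairs.append((BURST, 0))          # final stop burst
    return pairs

frame = nec_frame(0x04, 0x08)         # hypothetical "volume up" on device 4
print(len(frame))                     # 1 leader + 32 data bits + 1 stop = 34
```

Because the address is fixed at the factory, a TV just ignores frames whose address byte doesn't match, and that's all the filtering line-of-sight IR needs.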
The [top answer](https://www.reddit.com/r/explainlikeimfive/comments/brqi1n/eli5_why_do_remote_controls_use_infrared_light/eofqlgi/) already covered this a bit, but I want to expand upon one thing.
Some early remote controls, like the "Flashmatic", worked by having a photosensitive receiver at each corner of the TV screen. You would control volume or channel by shining a light at the corresponding corner.
The problem here may be pretty obvious: a stray sunbeam, a reflection off a watch/glass/mirror, or just some asshole with a flashlight can now flip through your channels or max your volume and there’s nothing you can do.
Because IR transceivers are easy and cheap to make. Longer wavelengths need increasingly long antennae. Shorter wavelengths are visible (which would be really annoying), and as they get shorter still, it becomes harder to make components that work with them.
We use longer wavelengths (microwaves) for Wi-Fi and other communications in the home with about the same range, because microwaves can go through walls. Near IR has the limitation that it can't go through walls, but usually if you don't have line of sight to the TV, you aren't watching it and don't need to control it.
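The scale difference above is easy to check with wavelength = c / frequency (the Wi-Fi and near-IR figures below are typical illustrative values, not from the thread):

```python
C = 299_792_458  # speed of light in m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters for a given frequency in hertz."""
    return C / freq_hz

wifi = wavelength_m(2.4e9)      # 2.4 GHz Wi-Fi band
ir = wavelength_m(3.2e14)       # ~940 nm near-IR, typical remote LED

# A quarter-wave antenna for 2.4 GHz is a few centimeters long;
# near-IR is about 100,000x shorter, so an LED replaces the antenna.
print(f"Wi-Fi quarter-wave antenna: ~{wifi / 4 * 100:.1f} cm")
print(f"Near-IR wavelength: ~{ir * 1e9:.0f} nm")
```

Running this gives roughly a 3 cm antenna for Wi-Fi versus a ~940 nm wavelength for the remote's LED, which is why an IR emitter needs no antenna at all.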