What’s the objective difference between a TV and a monitor?


Other than the fact one has an aerial connection and a TV tuner built in

In: Technology

5 Answers

Anonymous

Nowadays, not a whole lot. You could use a modern TV as a computer monitor just fine. But there are still a few key differences.

Monitors designed for gaming support higher refresh rates, that is, the number of times the screen redraws the image each second. Most TVs will only accept a 60Hz signal, while many monitors support 144Hz or higher. A higher refresh rate results in smoother motion, which looks better and can help in fast-paced competitive games, assuming your graphics card can keep up. Many of these monitors also support variable refresh rate, so they can adjust on the fly to match the frame rate your graphics card is putting out, which eliminates stuttering when the frame rate dips below the monitor’s maximum. Gaming monitors also tend to do less image processing than TVs, reducing the input delay between pressing a button on your controller and that action happening on screen, which again matters for competitive games.
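To see why the refresh rate matters, you can work out how long each frame stays on screen. This little sketch (the function name is just illustrative) shows that a 144Hz monitor replaces each frame more than twice as fast as a 60Hz TV:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """How long each refreshed image stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")
# 60 Hz -> 16.67 ms per frame
# 144 Hz -> 6.94 ms per frame
```

That ~10ms gap per frame is small, but in a fast-paced game it adds up to visibly smoother motion and slightly fresher information on screen.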

Monitors meant for graphic design and video/photo editing are built for greater colour accuracy than your typical TV, which matters especially for work that will be shown in digital cinemas or printed out.
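Colour accuracy is commonly quantified as a “delta E” value: the distance between the colour a display was asked to show and the colour it actually produced, measured in the CIELAB colour space. A minimal sketch using the simple CIE76 formula (the example colour values are made up for illustration):

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two L*a*b* colours (CIE76 formula)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target = (50.0, 20.0, -10.0)  # hypothetical reference colour in L*a*b*
shown = (51.0, 19.0, -9.0)    # what the display actually produces
print(f"Delta E: {delta_e_cie76(target, shown):.2f}")
# Delta E: 1.73
```

Professional monitors are often advertised with an average delta E below 2, a level of error generally considered hard for most viewers to notice.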
