You absolutely can use a television as a computer monitor.
Way back in the cathode ray tube days, TVs were distinguished from computer monitors by the fact that a TV could only accept one very specific signal (480 lines, interlaced, 60 Hz if you’re in North America), whereas PC monitors supported multiple resolutions.
The only appreciable difference between a PC monitor and a TV these days is that a TV will likely have some rudimentary built-in speakers and some streaming software like Netflix built in. Neither of those is a detriment to just using the TV as a monitor.
TVs are not built to respond quickly: playing games on a TV can be very painful because the input lag is so long.
PC monitors are not built to look beautiful: watching films on a computer monitor isn’t as good because the colours and contrast don’t pop like they do on a TV, and a monitor (hopefully) doesn’t have built-in features like motion blur and frame interpolation, which fills in predicted frames for an apparent “higher” framerate.
I’ve used a TV as a monitor. A TV has less flexibility in terms of placement: TVs are usually wall-mounted (requiring extra kit) or on their own feet, and either way aren’t designed to move much. Monitor stands and mount systems are usually better for fine adjustments to the screen position. Something to consider.
TVs, especially now, tend to have far more post-processing options (for better or worse). Most TVs now have AI upscaling, which on high-end sets can make lower-resolution content (like DVDs) look noticeably better.
They also have motion smoothing, or whatever the specific brand calls it: basically increasing the frame rate by generating additional frames to insert between the real ones. This is pretty universally considered horrific amongst people who actually care about how they watch things.
TVs also tend to have more input/output options, and will come with (again, for better or worse) a lot of smart TV apps.
Note that TVs often have image-improvement features (upscaling, colour correction, …) that increase latency. They have these because linear TV or streaming media is often lower quality than the TV can display.
That’s why they generally have a “game mode” that skips most of this processing to reduce input lag when using a games console.
Computer screens generally focus more on low input lag and less on image improvement, since your graphics card already renders the image at the screen’s native resolution.
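To make the input-lag point above concrete, here’s a small sketch that converts display latency into “frames of lag” at 60 Hz. The latency figures are assumed ballpark values for illustration, not measurements of any particular TV or monitor.

```python
# Rough illustration: how many frames a given display latency "eats" at a
# given refresh rate. The latency numbers below are assumed, typical
# ballpark figures, not measurements of any specific device.

def frames_of_lag(latency_ms: float, refresh_hz: float) -> float:
    """Convert display latency in milliseconds to frames at a refresh rate."""
    frame_time_ms = 1000.0 / refresh_hz
    return latency_ms / frame_time_ms

scenarios = {
    "TV with full post-processing (~100 ms, assumed)": 100.0,
    "TV in game mode (~15 ms, assumed)": 15.0,
    "Gaming monitor (~5 ms, assumed)": 5.0,
}

for label, latency in scenarios.items():
    print(f"{label}: {frames_of_lag(latency, 60):.1f} frames at 60 Hz")
```

At 60 Hz each frame lasts about 16.7 ms, so a TV doing heavy post-processing can be several whole frames behind your button press, which is exactly the lag that game mode is meant to cut down.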
TVs are for viewing from a distance. If you sit too close to a TV it will look worse, and you may be able to see the individual colour squares (pixels).
Monitors are for viewing at around arm’s length. If you sit further away, things may look too small or be hard to see.
When you view a monitor from the correct distance it should take up more of your vision than when you view a TV from the correct distance.
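The viewing-distance point above can be sketched numerically: compare the angular size of one pixel to roughly one arcminute, a common rule of thumb for normal visual acuity. The screen size and distances below are assumed examples, not figures from the answers above.

```python
import math

# Sketch: the same 4K panel that looks sharp across a living room shows
# visible pixels at arm's length. We compare the angular size of one pixel
# to ~1 arcminute, a common rule-of-thumb threshold for 20/20 acuity.
# The 55" panel and the two distances are assumed example values.

def pixel_arcminutes(diagonal_in: float, width_px: int, height_px: int,
                     distance_in: float) -> float:
    """Angular size of one pixel, in arcminutes, at a viewing distance."""
    diagonal_px = math.hypot(width_px, height_px)
    pixel_pitch_in = diagonal_in / diagonal_px  # physical size of one pixel
    return math.degrees(math.atan2(pixel_pitch_in, distance_in)) * 60

# A 55" 4K TV seen from 8 feet (couch) versus 2 feet (arm's length)
for dist_ft in (8, 2):
    arcmin = pixel_arcminutes(55, 3840, 2160, dist_ft * 12)
    verdict = "pixels visible" if arcmin > 1.0 else "pixels not resolvable"
    print(f'55" 4K at {dist_ft} ft: {arcmin:.2f} arcmin/pixel ({verdict})')
```

From the couch each pixel is well under an arcminute, so the picture looks continuous; at arm’s length the same pixels are comfortably above the acuity threshold, which is why a big TV on a desk can look blocky where a monitor doesn’t.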
Monitors are designed to go into power-saver mode quickly when there’s no input and to turn back on immediately when input returns. Typically nobody cares about a TV’s image-processing delay, because they’re playing a set video from start to finish and half a second of lag before the stream starts is fine. Monitors are designed for rapid processing and immediate task switching, since they’re part of the interface to the computer. If those two factors don’t matter to you, then a TV is fine as a monitor.