Alright, so there are a few crucial things here.
One: monitors generally have better pixel response times, meaning that things don't get that little blurry streak when they move around.
Two: TVs generally have a bit more latency. You won't notice it when watching videos or doing casual gaming, but there is a tiny delay between you moving your mouse and the cursor moving on screen, which, well, personally I find infuriating, but to each their own.
In practical terms you can plug a TV into a computer just fine, but there are drawbacks:
1. Low framerate. Nowadays 120+ Hz monitors are becoming standard, while TVs rarely do above 60 Hz right now. The difference between 60 and 120/144 Hz is extremely jarring (especially if you have dual monitors and can see them side by side).
2. Input delay. TVs like to have an *opinion* about the image they display to make it more “cinematic” or whatever, including AI-based stuff now. A TV can have as much as 0.1 s of delay between receiving an image and displaying it, and it's a crapshoot whether it lets you disable that processing to speed things up. Such a delay can cause motion sickness and a feeling of “drunk driving” in fast-paced games. A dedicated gaming monitor is literally 100 times faster there.
3. Fuckhuge. A TV is for watching from a couch. A 27″ 1440p/4K monitor is much better for viewing up close, both from a comfort and a pixel-density (image sharpness) perspective (some rough numbers follow this list).
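To put rough numbers on those three points, here's a minimal back-of-envelope sketch in Python. The latency figures and the 55″ TV size are illustrative assumptions based on the comparison above, not measurements of any particular model:

```python
import math

def frame_time_ms(refresh_hz):
    """Milliseconds between frames at a given refresh rate."""
    return 1000.0 / refresh_hz

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch for a given diagonal size and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

# Point 1: refresh rate -> time between frames
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")

# Point 2: display processing delay (illustrative figures, not measurements)
tv_lag_ms = 100       # a TV doing heavy image processing, ~0.1 s as above
monitor_lag_ms = 1    # a fast gaming monitor, roughly 1 ms class
print(f"TV shows your input ~{tv_lag_ms // monitor_lag_ms}x later than the monitor")

# Point 3: pixel density up close (the 55 inch TV is an assumed example size)
print(f'27" 1440p monitor: {ppi(27, 2560, 1440):.0f} PPI')   # ~109 PPI
print(f'55" 4K TV:         {ppi(55, 3840, 2160):.0f} PPI')   # ~80 PPI
```

Going from ~16.7 ms to ~8.3 ms per frame is the 60 → 120 Hz jump in point 1, and the roughly 100× latency gap is where the “drunk driving” feeling in point 2 comes from.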
The difference is whether or not there is a tuner. A tuner allows the device to receive over-the-air broadcasts. TVs also come with features like a remote control and speakers. A monitor may not have speakers or a remote, and it requires that you attach something to it to receive programming. A TV can be used as a monitor.
Not all TVs will look as good as a good monitor. Some TVs will.
For good-looking computer text on a TV you need support for something called “chroma 4:4:4”. Better TVs will support it, but you might need to specifically enable it, while cheaper ones might not support it at all (a rough sense of why is sketched below).
Some TVs do a lot of processing on the image, which causes input lag and can do odd things to the colours. Enabling “game mode” or “PC mode” usually helps with this.
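As a rough illustration of why full 4:4:4 is more demanding than what TVs normally expect, here's a back-of-envelope comparison of raw pixel data rates at 4K 60 Hz with 8 bits per sample. This counts uncompressed pixel data only and ignores HDMI blanking and link overhead, so treat it as a sketch rather than an exact spec figure:

```python
# Back-of-envelope: raw pixel data rate at 4K 60 Hz, 8 bits per sample.
# 4:4:4 keeps full-resolution colour (3 samples per pixel);
# 4:2:0 shares colour across 2x2 blocks (1.5 samples per pixel on average).
WIDTH, HEIGHT, FPS, BITS_PER_SAMPLE = 3840, 2160, 60, 8

def gbit_per_s(samples_per_pixel):
    bits_per_second = WIDTH * HEIGHT * FPS * samples_per_pixel * BITS_PER_SAMPLE
    return bits_per_second / 1e9

print(f"4:4:4 -> {gbit_per_s(3.0):.1f} Gbit/s of pixel data")   # ~11.9 Gbit/s
print(f"4:2:0 -> {gbit_per_s(1.5):.1f} Gbit/s of pixel data")   # ~6.0 Gbit/s
```

With 4:2:0 subsampling the colour information is shared between blocks of four pixels, which is fine for film but is exactly what makes small coloured text look fringed and smeary on a desktop.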
You *can* use a TV. I use a 43″ TV at 4K resolution (the same pixel count as a 2×2 grid of 21″ 1080p monitors, but with no bezels; quick maths below) and very much recommend it, but:
– you may find dot crawl and/or strange shimmer artifacts, although these can often (IME) be fixed by tweaking the refresh rate (60 Hz good, 30 Hz bad) and some of the image settings on the screen (various motion blur, interpolation and other image-processing settings, etc.)
– a TV doesn't typically understand the sleep and wake signals that a PC sends down the HDMI cable, so you may have to turn it on and off with the remote (and if so, typically set it not to sleep automatically, etc.)
I then add an indoor aerial and sometimes watch broadcast sports etc. on it over the weekend.
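If you're curious, the “43″ 4K ≈ four 1080p monitors” claim checks out; here's a quick sanity check, using 21.5″ (half the 43″ diagonal) as the equivalent 1080p panel size:

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch for a given diagonal and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 4K panel has exactly the pixel grid of a 2x2 arrangement of 1080p panels.
assert (3840, 2160) == (2 * 1920, 2 * 1080)

# A 43" 4K screen has the same pixel density as a 21.5" 1080p monitor
# (21.5" being half the 43" diagonal).
print(f'43" 4K:      {ppi(43, 3840, 2160):.0f} PPI')    # ~102 PPI
print(f'21.5" 1080p: {ppi(21.5, 1920, 1080):.0f} PPI')  # ~102 PPI
```

So the pixel density up close is roughly the same as a typical 1080p office monitor, which is why text stays readable at desk distance.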
Two things essentially:
1. Monitors don't have TV stuff, such as tuners, sometimes speakers, image enhancements, and tons of inputs for the many things you often hook up to a TV. Basically, monitors aren't meant to be the screen sitting in your living room and/or hooked up to media devices. Most cheap monitors don't even have to look good – they're typically washed out and have wonky lines. What they are good at is making computer stuff look good: a lot of decently priced monitors have high refresh rates and make sure everything is displayed pixel-perfect, exactly as the computer intends.
2. TVs only need to be good at displaying things for you to watch at your leisure. That means typical 60 Hz refresh rates, colors that are vibrant rather than accurate, and image-processing stuff like soap-opera mode and sharpening. Even game modes aren't standard, and where they exist their quality can vary. Often, to make a TV behave like a monitor, you have to turn on game mode and turn every image-enhancement setting off.
Therefore, in normal home/office use, you can use a TV as a monitor, and a monitor as a screen for whatever media you choose to hook it up to. It's when you do more serious computer stuff and more serious TV stuff that the differences really start to matter.
Typically, monitors are purpose-built computer displays. They usually don't have RF tuners or speakers. Monitors don't crop the image at all and display all pixels faithfully.
Televisions have RF tuners and speakers. Many times they can be used like monitors, but sometimes they crop off a bit of the picture (overscan).
There may also be model-specific features like HDR, scan rates and refresh times. Many computer monitors don't have those features, while TVs usually have them enabled since they are optimized for television use.
TVs are made to be watched from across the room and to look pretty, not accurate. They will enhance all sorts of things to make a movie or a game look good.
Monitors are the opposite. They're made to be used at very short range, to minimize eye strain, and to be precise, not pretty. They're made to be used for hours and to be trusted on color accuracy by people who do graphics work.
At short range, a TV looks worse than a monitor and is way harder on the eyes.
Get a monitor for your desk, get a TV for your living room, always.
Edit: btw, if you're wondering, it's the exact same thing with audio, between hi-fi speakers (= TVs) and monitor (“studio”) speakers (= monitor screens).