Why do 4K videos on a 1080p monitor look better than 1080p videos on the same monitor? The monitor displays the same number of pixels in both cases, doesn’t it?

60 Answers

Anonymous 0 Comments

1. The video can be down-sampled, which results in better image quality.
2. If you are using virtually any streaming site, the quality of a video is quite a bit lower than the resolution would suggest, so a low-quality 4K video ends up looking better than a low-quality 1080p video on YouTube (see the rough bitrate arithmetic below).
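
A rough back-of-the-envelope sketch of point 2, using made-up but plausible bitrates (the real figures vary by service and content):

```python
# Illustrative arithmetic only: the 16 Mbps and 5 Mbps figures below are
# hypothetical stand-ins for a streaming service's 4K and 1080p bitrates.
def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    """Average number of bits the encoder can spend per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

bpp_4k = bits_per_pixel(16, 3840, 2160, 30)   # ~0.064 bits per source pixel
bpp_1080 = bits_per_pixel(5, 1920, 1080, 30)  # ~0.080 bits per source pixel

# After the 4K stream is scaled down to 1080p, each displayed pixel is backed
# by four source pixels' worth of encoded data:
print(f"4K stream, per displayed 1080p pixel:    {bpp_4k * 4:.2f} bits")  # ~0.26
print(f"1080p stream, per displayed 1080p pixel: {bpp_1080:.2f} bits")    # ~0.08
```

Even with hypothetical numbers, the point stands: the 4K stream is rarely four times leaner per pixel, so more total data ends up behind each displayed 1080p pixel.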

Anonymous 0 Comments

4K videos are often of a larger file size or streaming bitrate, and so will have more information and appear less compressed when viewed at 1080p.

The improvement isn’t in the number of pixels but in the amount of information. It hardly matters if 4 pixels are lumped together into a block in 4K, because once they’re converted to 1 pixel at 1080p the changes made by compression are barely noticeable. Lump 4 pixels together in 1080p, however, and you might start noticing it.
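
A minimal sketch of that averaging effect, treating compression artefacts as random per-pixel noise (real codec errors are blockier and more structured than this):

```python
import numpy as np

def box_downscale_2x(frame: np.ndarray) -> np.ndarray:
    """Downscale an (H, W) image by 2 in each direction with a 2x2 box filter."""
    h, w = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(0)
clean = np.full((2160, 3840), 128.0)            # a flat grey test frame
noisy = clean + rng.normal(0, 8, clean.shape)   # pretend per-pixel compression error

# Averaging each 2x2 block into one 1080p pixel lets independent errors
# partly cancel, roughly halving the visible error in this toy case.
print(np.abs(noisy - clean).mean())                    # error at 4K, ~6.4
print(np.abs(box_downscale_2x(noisy) - 128.0).mean())  # error after downscale, ~3.2
```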

Anonymous 0 Comments

In addition to the points other commenters made about 2160p video usually being distributed at a higher bitrate, most (if not all) digital video you’ve ever seen (web video, DVDs, Blu-rays, UHD Blu-rays, …) uses a process called [YCbCr 4:2:0 chroma subsampling](https://en.wikipedia.org/wiki/Chroma_subsampling). The idea is that because the human eye is more sensitive to changes in brightness than changes in color tone, only the brightness portion of the image is stored at full resolution. Each 2×2 block of brightness samples shares a color tone, so while the brightness information is 2160p, the color information is only 1080p. If you’re watching 1080p video, its color information is stored at 540p.
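
A minimal sketch of the 4:2:0 idea, assuming a frame already split into Y, Cb, and Cr planes stored as arrays; real encoders also filter the chroma rather than just averaging it:

```python
import numpy as np

def subsample_420(y: np.ndarray, cb: np.ndarray, cr: np.ndarray):
    """Keep luma (Y) at full resolution; average each 2x2 block of Cb and Cr."""
    def halve(c: np.ndarray) -> np.ndarray:
        h, w = c.shape
        return c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, halve(cb), halve(cr)

# For a 2160p frame, luma stays 3840x2160 while chroma drops to 1920x1080,
# which is exactly the pixel grid of a 1080p display. For a 1080p frame,
# chroma is only 960x540.
y = np.zeros((2160, 3840), dtype=np.float32)
cb = np.zeros((2160, 3840), dtype=np.float32)
cr = np.zeros((2160, 3840), dtype=np.float32)

y2, cb2, cr2 = subsample_420(y, cb, cr)
print(y2.shape, cb2.shape, cr2.shape)   # (2160, 3840) (1080, 1920) (1080, 1920)
```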

Anonymous 0 Comments

There are a number of factors involved in this.

Yes, there may be compression artefacts which are different depending on how YouTube serves a video stream, but there’s more to it.

When a camera is described as 1920×1080 or 3840×2160, that typically refers to the number of pixels on the imager chip and is the theoretical maximum resolution that the camera’s capable of. The reality is that you never get that.

Lens quality (by which I mean sharpness, for this explanation) does vary. Lenses which were produced for the professional (broadcast) cameras in the 1920×1080 resolution range are not as sharp as today’s UHD / 4K lenses. Depending upon iris / aperture setting, even a lot of the expensive zoom lenses from the major manufacturers struggled to resolve the full resolution across the entire image, with the edges of the frame typically being softer than the centre.

The way in which the imager sensor is constructed is also a bit counterintuitive. On the face of it, you might think that the sensor is designed to maximise its possible image sharpness, but that isn’t the case. There’s an issue called aliasing, where a sensor receiving image detail finer than it can resolve will produce optical errors. A good pictorial example is here:

https://matthews.sites.wfu.edu/misc/DigPhotog/alias/artifact.jpg

To prevent this, all modern imagers have an anti-aliasing filter (sometimes called an optical low-pass filter or OLPF) in front of them. This is a precision-made, ever-so-slightly blurry piece of glass, and it’s there to filter out detail which the sensor can’t resolve correctly.

Due to the way they’re made, they tend to knock down a little of the very sharpest detail which the sensor IS capable of resolving, as well as the stuff they should be filtering out.

Detailed explanation here: https://petapixel.com/what-is-a-low-pass-filter/
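
A crude numerical analogue of what the OLPF is doing (this is signal-processing hand-waving, not real optics): sample a stripe pattern finer than the sampling grid, once directly and once after a blur, and compare how much false pattern survives.

```python
import numpy as np

def make_stripes(n: int, period: float) -> np.ndarray:
    """A square image of vertical stripes with the given period in pixels."""
    x = np.arange(n)
    return 0.5 + 0.5 * np.sin(2 * np.pi * x / period)[None, :] * np.ones((n, 1))

def sample_every_4th(img: np.ndarray) -> np.ndarray:
    return img[::4, ::4]                  # naive decimation: fine detail aliases

def blur_then_sample(img: np.ndarray) -> np.ndarray:
    k = np.ones(4) / 4                    # crude 4-tap box "low-pass filter"
    blurred = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, img)
    return blurred[::4, ::4]

scene = make_stripes(512, period=3.0)     # detail finer than the coarse grid can hold
print(sample_every_4th(scene).std())      # large: a false, coarser pattern appears
print(blur_then_sample(scene).std())      # much smaller: the unresolvable detail was filtered out
```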

I work with TV broadcast cameras for a living (as an engineer for a specialist camera company), and can absolutely say that a down-scaled 3840×2160 picture, displayed on a 1080p professional monitor, often looks considerably sharper than what you’d get out of a good 1080p camera and lens. This is uncompressed 3 Gb/s video with no MPEG / H.26x / MJPEG processing.

The UHD cameras all have optical low-pass filtering, but of course that only softens detail fine enough to trouble a UHD sensor; it has no effect on the level of detail that a 1080 sensor would see. As you’d expect, a UHD-spec lens will also typically be sharper across the whole field of view than an older HD lens.

The electronic down-scaling that the manufacturers use is the electronic equivalent of the optical filter that a 1080 sensor would have, but it removes far less of the sharp detail and doesn’t do as much collateral damage to the rest of the image.

I think this is the main reason why some UHD images look as crisp as they do on a 1080p screen: in the real world, most 1080 camera / lens combinations don’t quite push out the full 1080 resolution.
