How are resolution values counted for HD, full HD etc?

Hey,
HD is 16:9 × 80 (1280:720), and full HD is 16:9 × 120 (1920:1080).
The question is: what do those values 80 and 120 stand for? I tried to figure it out on my own, but no luck.

5 Answers

Anonymous 0 Comments

You’ll also find that 16:9 × 160 = 2560:1440, which is double 1280:720 in each dimension.

4K is 16:9 × 240 = 3840:2160, which is double 1920:1080 in each dimension.

Why does it scale like this? Because we like scaling by factors of two. But you might say that the multipliers 80, 120, 160, 240 don’t scale by 2; they scale by about 1.33x to 1.5x.

The area, i.e. the number of pixels, does scale by roughly 2. Area is one side multiplied by the other. An increase of 1.5x in both sides means the area scales by 1.5 × 1.5 = 2.25, roughly 2x.

1.33x to 1.5x is close to the square root of 2 ≈ 1.41, and sqrt(2) × sqrt(2) = 2.
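
A quick sketch (Python; not from the original answer, just an illustration) that checks this: each step up the 80/120/160/240 ladder grows each side by roughly 1.33-1.5x and the pixel count by roughly 2x.

```python
# Illustrative check: how the 16:9 multipliers 80, 120, 160, 240
# scale in linear size vs. total pixel count.
import math

multipliers = [80, 120, 160, 240]            # 720p, 1080p, 1440p, 2160p
for a, b in zip(multipliers, multipliers[1:]):
    w1, h1 = 16 * a, 9 * a                   # e.g. 1280 x 720
    w2, h2 = 16 * b, 9 * b                   # e.g. 1920 x 1080
    linear = b / a                           # how much each side grows
    pixels = (w2 * h2) / (w1 * h1)           # how much the pixel count grows
    print(f"{w1}x{h1} -> {w2}x{h2}: sides x{linear:.2f}, pixels x{pixels:.2f}")

print("sqrt(2) =", round(math.sqrt(2), 2))   # ~1.41, the 'ideal' linear step
```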

Anonymous 0 Comments

The 80 and 120 are just incidental, not chosen; you need to start from the beginning.

In the beginning there was NTSC (sorry PAL, you don’t matter here). NTSC was the transmission standard for color TV and gave you an image that was 640 pixels wide and 480 tall (a 4:3 ratio). The original choice of how many rows and columns was arbitrary: a number had to be chosen, it was chosen to be even, and it would shape everything that came after.

Then we wanted to move to “HD” and handle both standard TV (4:3) and movie formats (21:9) better, so they settled on a 16:9 ratio. The first attempt was 854:480, but they eventually settled on twice the normal width, giving you 1280 wide (640×2) and 720 tall. This ensured that normal TV programs still fit, just scaled up by 1.5x with black bars on the sides, while widescreen movies could show as widescreen with small black bars at the top and bottom.

Full HD boosted the count by 50% in each direction, so you ended up with 1920 (640×3) by 1080, giving a bit over 2x the pixel count in total.

Then 4K UHD comes along and just doubles everything again for 4x the pixel count, so you get 3840 (640×6) × 2160.

Every weird multiplier you find is actually derived from the arbitrary numbers chosen for NTSC back in the 1950s.
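
A small sketch of that derivation (Python, illustrative only; the factors are the 640-multiples from this answer):

```python
# Illustrative sketch: HD widths as multiples of NTSC's 640 columns,
# with heights from the 16:9 ratio and pixel counts relative to 640x480.
ntsc_w, ntsc_h = 640, 480                                      # the arbitrary starting point
steps = {"HD 720p": 2, "Full HD 1080p": 3, "4K UHD 2160p": 6}  # width = 640 * factor

for name, factor in steps.items():
    w = ntsc_w * factor
    h = w * 9 // 16                          # keep the 16:9 shape
    ratio = (w * h) / (ntsc_w * ntsc_h)      # pixels relative to NTSC
    print(f"{name}: {w}x{h}, {ratio:.2f}x the pixels of 640x480")
```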

Anonymous 0 Comments

You’re looking at it backwards. The 80 and 120 values you’ve pointed out are not causes, but effects. “[HDTV](https://en.wikipedia.org/wiki/High-definition_television)” defines a set of acceptable formats including 720p, 1080i, and 1080p, and a widescreen aspect ratio (the ratio of the image’s width to its height) of 16:9. 80 and 120 have no meaning beyond the simple mathematical relationship between the aspect ratio and the specified resolution.
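
To make that concrete, a tiny sketch (Python, purely illustrative): the 80 and 120 are just what you get when you divide a 16:9 resolution by its aspect ratio.

```python
# Minimal sketch: the "multiplier" is just the resolution divided by the
# aspect ratio; it isn't a quantity that any standard defines on its own.
resolutions = [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]

for w, h in resolutions:
    # For a true 16:9 frame, width/16 and height/9 come out to the same number.
    print(f"{w}x{h}: width/16 = {w / 16:.0f}, height/9 = {h / 9:.0f}")
```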

Anonymous 0 Comments

Old TV shows and movies were 4:3.

Modern movies are usually around 2.35:1 to 2.40:1 (~22:9)

TV makers settled on 16:9 as an in-between so you could display both easily.

Modern TV shows adopted the new, wider format.

To digitize 4:3 film, they chose 480p.

To make 16:9 TV while still handling 480p 4:3 content, you naturally get both 720p and 1080p when you do the math (a 4:3 frame at 480 lines is 640 wide; doubling that width to 1280 at 16:9 gives 720 lines, and tripling it to 1920 gives 1080).

HQ is 480p
HD is 720p
FHD is 1080p
QHD is 1440p
4K UHD is 2160p
8K UHD is 4320p.
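
A quick sketch (Python, illustrative; assumes square pixels and an exact 16:9 frame) that turns the labels above into full resolutions:

```python
# The "p" number is the height; the width follows from the 16:9 ratio.
# 480p is the odd one out, since it's the 4:3 640x480 format the rest
# were derived from.
heights = {"HD": 720, "FHD": 1080, "QHD": 1440, "4K UHD": 2160, "8K UHD": 4320}

for name, h in heights.items():
    w = h * 16 // 9                          # exact for all of these heights
    mp = w * h / 1e6                         # total pixels, in millions
    print(f"{name} ({h}p): {w}x{h}, {mp:.1f} MP")
```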

You’ll find a lot of old YouTube videos (especially music videos) with HD or even HQ in the title, since YouTube used to top out at 320×240 (240p).