Why is 2160p video called 4K?

13 Answers

Anonymous 0 Comments

The real answer is that 4K sounds better than 2160p.

HD TV and video were already out, and 1080 was pretty much in the vernacular. Now they had this better HD picture, but how do you market that? Call it something flashy that sticks. 4K sticks. It also happens to have exactly four times the pixels of the HD format that was already out.
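For anyone who wants to check that math, a quick sketch in Python:

    # "4K" UHD vs. Full HD pixel counts
    uhd = 3840 * 2160   # 8,294,400 pixels
    fhd = 1920 * 1080   # 2,073,600 pixels
    print(uhd / fhd)    # 4.0 -- exactly four times the pixels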

Nobody would have bought the new TVs if they had just said “2160” or “HD+” on them, so they needed a catchier way to sell them and drive mass adoption.

Anonymous 0 Comments

Others have explained part of this, but some of it has to do with context and history.

1280×720 (720p) is the original “HD” resolution, while 1920×1080 (1080p) is “Full HD”. 1080p ended up becoming very popular in television, while other resolutions faded.

Because of how scaling works, it is best to quadruple the pixel count (double both dimensions) rather than increase it in smaller increments – each old pixel maps cleanly onto a 2×2 block of new pixels, so older content still looks normal on the higher-resolution screen.

So the next step up was 3840×2160 (4K), which is exactly four times the pixel count of 1080p.

So the reason “4K” has fewer than 4,000 horizontal pixels (3840×2160) is that it was defined as quadruple the previous popular TV resolution, 1920×1080.
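To illustrate the clean-scaling point, here is a minimal nearest-neighbour 2× upscale sketch in Python (the tiny frame is made-up example data): each source pixel simply becomes a 2×2 block, no blurry interpolation required.

    def upscale_2x(image):
        """Nearest-neighbour 2x upscale: each pixel becomes a 2x2 block."""
        out = []
        for row in image:
            doubled = [px for px in row for _ in range(2)]  # duplicate columns
            out.append(doubled)
            out.append(list(doubled))                       # duplicate rows
        return out

    frame = [[1, 2],
             [3, 4]]
    for row in upscale_2x(frame):
        print(row)
    # [1, 1, 2, 2]
    # [1, 1, 2, 2]
    # [3, 3, 4, 4]
    # [3, 3, 4, 4]

Double both dimensions of 1920×1080 this way and you land exactly on 3840×2160.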

Computer monitors had a similar jump: 1280×720 was popular, and then 2560×1440 – exactly quadruple – became popular.

Anonymous 0 Comments

4K proper is the vanilla big-screen cinema standard: 4096×2160 (the DCI spec, about 1.9:1). While you can get true 4K TVs, they are super rare, and most TVs use the more common Blu-ray UHD standard of 3840×2160 in 16:9. It works better in most rooms, and the small black bars at the top and bottom of the screen when showing cinema-ratio content won’t bother anyone – especially since we were used to way bigger crops on old 4:3 TVs. When film studios began marketing the term 4K, the home entertainment industry had to put it on their UHD devices too, so people would understand the quality is the same, just in a slightly different ratio (with a few pixels of black bars).

Many people get this wrong when buying a TV and think bigger is always better. But the truth is that the viewing-distance-to-size ratio is the most important thing after deciding to go for UHD.
A 65″ UHD TV will look worse than a 43″ UHD TV when you sit too close to it. However, sitting too far from a 43″ TV will make it hard to spot details you’d still be seeing on a bigger 65″.
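As a rough illustration of that trade-off, here is a sketch that estimates pixels per degree of vision for a 16:9 UHD panel at a given size and viewing distance (the 60-inch distance is just an example value):

    import math

    def pixels_per_degree(diagonal_in, distance_in, h_pixels=3840):
        """Approximate horizontal pixels per degree for a 16:9 panel."""
        width = diagonal_in * 16 / math.hypot(16, 9)   # panel width in inches
        fov = 2 * math.degrees(math.atan(width / (2 * distance_in)))
        return h_pixels / fov

    for size in (43, 65):                # inches, sitting ~5 feet away
        print(size, round(pixels_per_degree(size, 60)))
    # 43 111
    # 65 76

At the same distance, the 65″ panel spreads the same 3840 columns over a wider angle, so you get fewer pixels per degree – which is why sitting too close to the bigger screen looks worse.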

Anonymous 0 Comments

Marketing. Resolutions are typically named after the vertical pixel count, which is the lower number. The jump from 480p (SD) to 720p (HD) was HUGE visually – so huge that small screens like phones and handheld game consoles are still made with 720p panels. And the numbers in the names did look roughly double (480 → 720). However, that’s not quite how it works: you have to multiply the horizontal and vertical pixel counts. 480p (in the 16:9 ratio, which wasn’t common at the time) is 848×480, or 407,040 pixels. 720p is 1280×720 in standard 16:9 widescreen, or 921,600 pixels.

The jump from 720p to 1080p (FHD) came pretty quickly, and while it wasn’t as big visually, it was still definitely noticeable – and still over double the number of pixels. 1080p is 1920×1080 in 16:9, or 2,073,600 pixels. In name, the number again only looked about 400 bigger, but importantly, 1080p was the baseline for a long time.

Blu-ray, around 2006, allowed for full-HD images. Video games of that era (PS3/Xbox 360) often struggled to hit 1080p, but PCs could do it, work monitors and TVs often had 1080p panels (and they’re still popular), and it was standard in games by the PS4/Xbox One in 2013. The PS4 Pro and Xbox One X pushed 4K gaming a bit in 2016, but those were half-gen upgrades and the PS4 Pro didn’t fully render it natively – so that’s still at least a DECADE of 1080p being the standard without access to anything new. For reference, DVDs at 480p were only king in the early ’00s. 720p never had physical media pushing it that took off the way DVD or Blu-ray did.

1440p (QHD) started to become a thing for monitors, as new resolutions typically do first, but it wasn’t catching on for TVs. Like, at all. 720p had streaming to help it sell budget TVs; 1440p, not so much. It’s STILL not available on most streaming services, and 1080p actually looks worse on a 1440p screen because the scaling factor isn’t clean. And like 720p, it had no physical video media or console video games to boost it.
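That scaling complaint is easy to see in the numbers – a quick sketch using the line counts discussed here: an integer factor lets every source line map to a whole number of panel lines, while 1080 → 1440 does not.

    for src, dst in [(1080, 2160), (1080, 1440), (720, 1440)]:
        factor = dst / src
        verdict = "clean" if factor.is_integer() else "needs interpolation"
        print(f"{src}p on a {dst}-line panel: x{factor:.3f} ({verdict})")
    # 1080p on a 2160-line panel: x2.000 (clean)
    # 1080p on a 1440-line panel: x1.333 (needs interpolation)
    # 720p on a 1440-line panel: x2.000 (clean)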

1440p is 2560×1440 in 16:9, or 3,686,400 pixels. That’s 1.78 times the pixels of 1080p, not ~2.25 times like the previous upgrades. But more importantly, it didn’t SOUND much bigger either. The consumer sees the name and thinks, “What, only 400 more pixels again?”

Visually, I think going from 1080p to 1440p takes you about as far as going from 1080p to 4K, unless the screen is very big, of course. Particularly on a computer monitor, you’ll likely not notice the difference between 1440p and 4K even in video games; the only thing you might notice is a bit more aliasing. But it wasn’t really enough for video consumers, or for non-PC gamers. Even then, video cards were starting to plateau a bit; until recently it was hard to get a PC to run anything new at 1440p with a decent frame rate.

Anyway, 4K (UHD) is 3840×2160 in 16:9, or a whopping 8,294,400 pixels – 4× 1080p and 2.25× 1440p. Normally it would be called 2160p, after the vertical pixel count. But for marketing purposes, they decided 3840 (the horizontal pixel count) was close enough to 4,000, and “4K” sounds like four times 1080p – which, in total pixels, it actually is. That matters for two reasons. #1, it sounds like a new technology: most consumers know standard definition, HD, and now 4K. And #2, because visually (for the average person and screen size) it’s not all that different from 1080p, and even less different from 1440p, it needed to sound more impressive. That, combined with UHD Blu-ray, the PS5/Xbox Series consoles, and falling TV prices, has made 4K a smashing success.
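Pulling this answer’s whole ladder into one place, a short script to reproduce the pixel counts and jump ratios quoted above:

    # 16:9 resolutions discussed above (848x480 standing in for SD)
    ladder = [("480p", 848, 480), ("720p", 1280, 720),
              ("1080p", 1920, 1080), ("1440p", 2560, 1440),
              ("2160p/4K", 3840, 2160)]

    prev = None
    for name, w, h in ladder:
        pixels = w * h
        jump = f" ({pixels / prev:.2f}x the previous step)" if prev else ""
        print(f"{name:>9}: {w}x{h} = {pixels:,} pixels{jump}")
        prev = pixels
    # Steps print as 2.26x, 2.25x, 1.78x, 2.25x -- note the weak 1440p jump.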

Retroactively, 1440p is sometimes called “2K” now, even though 2560 is further from 2,000 than 3840 is from 4,000. But it is more accurate in the sense that it’s around double the pixels of 1080p and half of 4K.

Anonymous 0 Comments

It was a sales tool. 4K sounds 4× better than 1080p. And it is, when you multiply the numbers:
3840 × 2160 = 8,294,400 pixels
1920 × 1080 = 2,073,600 pixels

If you told Joe Average Consumer “2160p”, he would think it’s only twice the resolution of 1080p.

Edit: math layout for readability.

Anonymous 0 Comments

Marketing assholes called it 4k because switching from the number of vertical pixels to the number of horizontal pixels yields a bigger number.

Anonymous 0 Comments

It’s mostly for marketing reasons: most people would think 2160p was double the resolution of 1080p, when it is in fact 4× the pixels. By calling it 4K – the width resolution (4096 or 3840, depending on the standard used) – instead of sticking with the height resolution (2160), it now “sounds” like 4× 1080p to a typical consumer.

Anonymous 0 Comments

The question has been answered, but part of this is also the mess of compromises made to standardize what the new HD “widescreen” TV would be. For… reasons… 1.85:1, already a standard in cinema, was rejected in favor of 1.78:1 (aka 16:9). IIRC this partly had to do with some Japanese vendors getting ahead of the game in the early ’90s, before North America and Europe were thinking about HD.

Interestingly, this also means some intermediate aspect ratios were introduced as a compromise between them. 14:9 was a thing – I even have a ground glass for an older film camera with these markings. It was a sort of safety format that could be cropped for 4:3 TVs without too much issue and shown in 16:9 with minimal side bars. In fact, the Star Trek: TNG reruns on BBC America crop the 4:3 masters down to this 14:9 aspect ratio to give the broadcast a little more widescreen while minimally chopping off the tops and bottoms (which are somewhat into the safe zones anyway).
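For a feel for how much is lost between these ratios, a little geometry sketch (no real footage, just the ratios mentioned here) that reports how much screen height goes to letterbox bars when wider content is shown on a narrower screen:

    def bar_fraction(content_ar, screen_ar):
        """Fraction of screen height lost to letterbox bars
        (0 if the content isn't wider than the screen)."""
        return 1 - screen_ar / content_ar if content_ar > screen_ar else 0.0

    for content, screen, label in [(14/9, 4/3, "14:9 on a 4:3 set"),
                                   (16/9, 14/9, "16:9 on a 14:9 frame"),
                                   (1.85, 16/9, "1.85:1 on a 16:9 set")]:
        print(f"{label}: {bar_fraction(content, screen):.1%} of height is bars")
    # 14:9 on a 4:3 set: 14.3% of height is bars
    # 16:9 on a 14:9 frame: 12.5% of height is bars
    # 1.85:1 on a 16:9 set: 3.9% of height is bars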

But yeah, TV 4K being 3.8K is basically a direct result of doing 1920×1080 instead of 2048×1080 – it’s quadrupling the pixels. This is just one of many things in the business we’re beholden to because of poor planning or a weird compromise. Hell, it still hurts my head that the DCI CinemaScope standard is 2.39:1 instead of just 2.40. Or, ya know, supporting 2.40 AND 2.35, given that it also supports 1.85…

Anonymous 0 Comments

u/pseudopad added an important detail. When HD and Blu-ray first arrived, TVs were moving up from 480 scan lines to 720p, 1080i, and 1080p. The i and p indicated interlaced or progressive scan – i updated every other line per refresh, while p refreshed the whole frame. 1080p quickly became the standard.
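A toy sketch of the difference, with a made-up four-line “screen”: a progressive refresh carries every line, while interlaced refreshes alternate between the even and odd lines.

    frame = ["line0", "line1", "line2", "line3"]   # made-up 4-line screen

    def progressive(frame):
        yield frame                 # every refresh carries all lines

    def interlaced(frame):
        while True:
            yield frame[0::2]       # even field: lines 0, 2, ...
            yield frame[1::2]       # odd field:  lines 1, 3, ...

    print(next(progressive(frame)))   # ['line0', 'line1', 'line2', 'line3']
    fields = interlaced(frame)
    print(next(fields))               # ['line0', 'line2']
    print(next(fields))               # ['line1', 'line3']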

Now that i and p are forgotten relics, marketing stepped in to rebrand 2160p as 4K, with mild confusion for consumers.

Anonymous 0 Comments

They switched from using the vertical resolution to the horizontal resolution in order to make the numbers look bigger. The jump from 1080p to 4K is actually smaller than the jump from SD to HD, so they needed another way to get people to upgrade.
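A rough check of that claim, taking 720×480 as SD (an assumption – analogue SD wasn’t a clean pixel grid):

    sd, fhd, uhd = 720 * 480, 1920 * 1080, 3840 * 2160
    print(fhd / sd)    # 6.0 -- the SD -> Full HD jump
    print(uhd / fhd)   # 4.0 -- the Full HD -> 4K jump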