Why are TV user interfaces so slow and glitchy?


I feel like TV apps like Netflix (maybe the exception) / HBO / Hulu / AppleTV+ / et al. are incredibly slow and pretty poorly designed. When I hit a button a few times on the remote to scroll, it will lag, then jump suddenly across a few icons. Computer-based apps don’t do this. What gives?


6 Answers

Anonymous 0 Comments

Well, it’s just like buying any other electronic device.

I have an LG CX and it has never slowed down or shown age.

I also have a low-end Samsung 4K and it is super slow. It just can’t keep up with modern software feature expansion.

Most of your cheaper or mid-range TVs will have a single processor with a tiny bit of RAM.

This one chip is in charge of apps, upscaling, background processing, transcoding, etc.

This is a lot of work for a little ARM processor. It usually takes a beefcake to muscle out good transcoding or upscaling. The app developers try to work within the limits of assumed RAM use, but they can only sacrifice so much for low-end or older models.

Once you run out of RAM, the system will usually start using its slow internal storage as memory swap, which in this case is about as effective as Windows ReadyBoost.

This all results in a slow TV.
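The “lag, then jump across a few icons” from the question is what an overloaded processor like this produces: the UI falls behind the remote’s key presses, then drains the whole backlog in one frame, so the cursor leaps several icons at once. A toy sketch of that effect (all timings hypothetical, not any real TV’s firmware):

```python
from collections import deque

def simulate_scroll(press_times_ms, frame_cost_ms):
    """Simulate a UI loop that drains queued remote-key events once per frame.

    press_times_ms: when the user pressed "right" on the remote
    frame_cost_ms:  how long one UI frame takes (large when the SoC is busy)
    Returns a list of how many icons the cursor moved on each frame it moved.
    """
    queue = deque(press_times_ms)
    now = 0
    jumps = []
    while queue:
        now += frame_cost_ms          # the UI thread finishes one frame
        moved = 0
        while queue and queue[0] <= now:  # drain every press that has arrived
            queue.popleft()
            moved += 1
        if moved:
            jumps.append(moved)
    return jumps

presses = [0, 100, 200, 300, 400, 500]      # six presses, 100 ms apart
print(simulate_scroll(presses, frame_cost_ms=16))   # fast UI: [1, 1, 1, 1, 1, 1]
print(simulate_scroll(presses, frame_cost_ms=250))  # slow UI: [3, 3] — the cursor jumps
```

With fast frames, each press moves the cursor one icon; with slow frames, presses pile up and get applied in bursts, which is exactly the lag-then-jump behavior.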

Anonymous 0 Comments

Making a snappy, responsive, nice-looking app takes a lot of time, effort, and money. And, it requires a more powerful processor to run it without a lot of lag and jerkiness. All of this adds up to a TV that costs more. And since the market for TVs is extremely competitive, even shaving off $50 from the price of a TV can make a big difference in sales, so TV manufacturers are constantly looking for any way to shave pennies off of the cost of the TV.

Anonymous 0 Comments

The computer in your TV isn’t very powerful. You generally aren’t buying your TV based on its processor speed, so manufacturers skimp as much as they can while still making it functional.

Anonymous 0 Comments

You spend a very small percentage of your time actually using the app interface. On your phone you’re probably gonna scroll around and browse, but odds are that if you’ve gone to the bother of putting it on the actual TV, you already know what you’re going to watch. The interface only has to be good enough to get you to that thing; then you just watch it for hours, possibly binging whatever the app serves up next through a simple menu. Other parts of the experience would bother you more (bad video quality, buffering, audio desync), so those get more money and attention.

Anonymous 0 Comments

Consumers don’t generally buy a particular “smart” TV for the quality of the apps. They typically don’t research it at all, paying more attention to picture size/quality and price. Many reviews barely even mention how well the smart apps run.

TV manufacturers know this. So they generally use the cheapest processors they can get away with, and it shows in their performance.

My advice is to buy a Roku or similar device and don’t use the built-in apps at all.

Anonymous 0 Comments

Combination of OS and application. You’re discussing non-tech companies (Apple being the exception) making tech. Netflix depends on you living in their app; it’s where all their “money” is made, so they’re incentivized more to create a first-class experience in hopes you’re not password sharing. All the others have a diverse portfolio of other products/services and won’t dedicate the same attention (yet). With Roku, Android/Google TV/Chromecast, Fire OS, and tvOS all being fairly different from one another, companies are looking at at least four platform teams to staff on top of broadcasting and filming costs. Add the cost of a US-based developer being around $99k conservatively, each team being at least four people, and you start to see why there’s an inconsistency in experiences. They’re looking for a cost-effective common denominator while attempting to create a consistent experience across all apps.