It’s a combination of having your expectations elevated by modern resolutions and the fact that screen sizes are generally just much bigger now.
You used to watch low-resolution content on tiny screens, and that was all you knew. Now you watch very high-resolution content on very large, high-resolution screens. The older content no longer stands on its own; it gets contrasted with these better images, especially when displayed on screens that play to the newer content's strengths while highlighting the flaws of the older content.
The best reason I can think of is that video looks best when it is the same or higher resolution than the screen you are watching it on. Lower-res video usually looks better on a low-res screen than it does on a higher-resolution screen (how much better depends on how the video is upscaled). Also, you were used to that video quality at the time, and now you are used to something much higher resolution, and the transition has been slow. As the years pass nothing seems to change much, and then you look back and see how far the tech has come.
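To make the upscaling point concrete, here is a minimal sketch using Python's Pillow library. The file names and frame size are made up for illustration, but the resampling filters are real Pillow options: both stretch the same low-res frame to modern-screen size, and neither can invent detail that was never recorded.

```python
# Sketch: how the choice of upscaling filter changes what you see on a high-res screen.
# "old_frame.png" is a hypothetical low-res frame (e.g. 480x360) from an old home video.
from PIL import Image

frame = Image.open("old_frame.png")
target = (1920, 1440)  # blow it up to fill a modern display (same 4:3 aspect ratio)

# Nearest-neighbour just repeats pixels: blocky, with every flaw enlarged.
blocky = frame.resize(target, Image.Resampling.NEAREST)

# Lanczos interpolates between pixels: smoother, but detail that wasn't captured stays missing.
smooth = frame.resize(target, Image.Resampling.LANCZOS)

blocky.save("old_frame_nearest.png")
smooth.save("old_frame_lanczos.png")
```

Comparing the two outputs side by side shows why the same old tape can look noticeably better or worse depending on how your TV or player scales it up.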
Capture your content at the highest quality possible. The quality of the capture determines most of what you can ever get back when viewing the output on a screen.
Video equipment matters. Equipment back then could not capture footage at as high a quality as devices today. Monitor/TV quality matters too, but playing an old video on a high-end display won't give you high-end results.
As for why you didn't notice a problem back then: it's like when I first got glasses. My brain knew that things at a distance were blurry, but I didn't know they could be better, so I didn't think about it. The same went for video years ago. It was the technology of the time, and we knew it was as good as it could get, so our brains accepted it and ignored most of the issues. The same thing happened with video games.
With glasses, things far away were clearer, and that reset my expectations. Without them, I'm very aware of the blurriness and it bothers me. The same thing happens now when we watch home movies from the '90s. It's brutal.