This is totally me just being naive. I don’t work in the software realm, though I do have an interest in possibly doing so one day. But if we have stuff that’s been able to run (seemingly) successfully for years, or maybe even a decade, what maintenance even needs to be done on old programs? Is it simply people discovering security vulnerabilities and patching them? Is there more to it than that?
A lot of software you use doesn’t exist in isolation. It connects to various other services on the Internet. If those services change their APIs in breaking ways, your software needs to be updated. They might do this for security reasons or for other reasons entirely. For a trivial example, maybe you have to change a “share on Twitter” button to a “share on X” button.
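To make that concrete, here’s a minimal sketch of the kind of code that breaks when a service changes its API. The service, URL, and field names are all made up for illustration; the point is just that the program bakes in an assumption about someone else’s API, and when that API changes, the program has to be updated even though nothing in it “wore out”.

```python
import json
import urllib.request

def current_temperature(city: str) -> float:
    # Hypothetical weather service and response format, purely for illustration.
    url = f"https://api.example-weather.com/v1/current?city={city}"
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    # Works fine for years... until the service renames or drops "temp_f"
    # in a new API version, at which point this raises a KeyError.
    return data["temp_f"]
```

The program didn’t change, but the world around it did, so somebody has to go in and update it.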
You also might run into bugs that only get exposed when your app runs on a new operating system or new hardware. Sometimes that’s the result of a breaking change made by the OS vendor or hardware manufacturer, but often it’s the result of an incorrect assumption that was technically a bug all along but never had a chance to be noticed.
For example, there used to be pretty much two standard resolutions for computer monitors: 640×480 or 800×600. After several years, 1024×768 monitors started coming out, and a good number of Windows applications broke because of buggy code that said “if the user’s resolution is not 800×600, pop up an error message telling them they need a higher resolution monitor to use this software”. That was the result of bad programming, but it was also pervasive, and it made a good chunk of software unusable without upgrading it (or deliberately dropping your monitor’s resolution).
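A rough sketch of what that bug looks like in code (not any real application, just the shape of the mistake): the check tests for one exact resolution instead of a minimum, so a strictly *better* monitor still trips the error.

```python
REQUIRED = (800, 600)

def check_resolution(width: int, height: int) -> None:
    # Buggy: exact-match check. The hidden assumption is that no
    # resolution other than 800x600 will ever exist.
    if (width, height) != REQUIRED:
        raise SystemExit("You need a higher resolution monitor to use this software.")
    # The correct assumption would have been a minimum, e.g.:
    # if width < 800 or height < 600: ...

check_resolution(1024, 768)  # rejects a monitor that is strictly better
```

The assumption was wrong from day one, but until higher resolutions shipped, nothing ever exercised it, so the bug sat there unnoticed for years.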