Originally, software ran directly on physical computers: if you wanted to run a certain piece of software, you needed the right operating system and you had to install its dependencies (software that the software itself needs to run) on that physical machine.
Later came virtual machines: basically emulating a computer on your computer (for example, simulating a Linux computer on your Windows machine). A server with enough power could host several of these virtual machines at once, so you could run a couple of “computers” on one machine and didn’t need separate physical hardware for each of them.
If your software is small enough, however, the virtual machine adds so much overhead that it doesn’t really make sense to emulate a whole operating system just to run it. **Docker** (or containers more generally) is a way to create the minimal package needed to run your software. You take, say, the smallest possible Linux distribution, bundle it with all the software you depend on (if you need a database and ways to handle it, the software for that is included in the bundle), add your own software on top, and put it all into one neat package. It is everything you need, but nothing more.
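To make that a bit more concrete, here is a minimal sketch of how such bundles might be described and run in practice, using Docker Compose. The image names, port, and password are illustrative assumptions, nothing more. Note that `postgres:16-alpine` is itself exactly such a package: a tiny Alpine Linux plus the database software, and nothing else.

```yaml
# docker-compose.yml -- a hypothetical sketch; names, port and password
# are made up for illustration.
services:
  app:
    image: my-app:1.0            # your software, packaged as a container image
    ports:
      - "8080:8080"              # make the app reachable from outside
    depends_on:
      - db                       # the app needs the database to be up first
  db:
    image: postgres:16-alpine    # minimal Linux distribution + database software, nothing more
    environment:
      POSTGRES_PASSWORD: example # demo-only credentials
```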
This works great if you have a small piece of software, but for larger, more complex applications it might not be enough. Maybe you need multiple instances of your software running, maybe one of your containers has failed somewhere, maybe there is a surge in incoming traffic, and so on. To avoid handling all of that manually, you need orchestration tools like **Kubernetes**. If a Docker container is this small bundle that can run anywhere Docker is installed, Kubernetes is basically the overseer of these little containers in any larger, multi-system network of interdependent containers. While each container only knows “I am x, my task is y”, Kubernetes knows how many instances of x it needs, how x works together with z, and what to do if you suddenly need 10 times the amount of y performed.
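As a rough illustration of that overseer role, assuming the same hypothetical `my-app` image as above: with Kubernetes you declare the desired state in a manifest, and Kubernetes continuously works to keep reality matching it.

```yaml
# deployment.yaml -- a hypothetical sketch; the name, image and numbers
# are illustrative assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3                    # "how many of x it needs": keep 3 copies running
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: my-app:1.0      # the same little container bundle as before
          ports:
            - containerPort: 8080
```

If one of the three containers crashes, Kubernetes notices that only two replicas are running and starts a replacement on its own; add-ons like a HorizontalPodAutoscaler can raise the replica count automatically when traffic surges.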
**CI/CD** (Continuous Integration / Continuous Delivery) isn’t a technology in the same sense, but more of a way of working. In the past (and often still today), software had big releases: there might be a new version every couple of months or so. With CI/CD, you are constantly integrating and deploying changes, delivering them all the time instead of only at fixed release dates.
This has advantages, since you can respond more quickly, but it also has drawbacks: the number of updates increases, which means more administrative work, less time for testing per release, and a greater chance of shipping something that shouldn’t be shipped.
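In practice, this constant delivery is automated with a pipeline that runs on every change. A minimal sketch, assuming GitHub Actions as the tool and hypothetical `make test` and `deploy.sh` steps:

```yaml
# .github/workflows/ci.yml -- a hypothetical sketch using GitHub Actions,
# one of several CI/CD tools; the commands are illustrative assumptions.
name: ci-cd
on:
  push:
    branches: [main]             # every change pushed to main triggers the pipeline
jobs:
  build-test-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4                         # fetch the latest code
      - run: make test                                    # run the automated test suite
      - run: docker build -t my-app:${{ github.sha }} .   # package the change as a container
      - run: ./deploy.sh my-app:${{ github.sha }}         # hand the new version to deployment
```

Because every push runs the same automated tests before anything is deployed, the pipeline itself becomes the main safeguard against shipping something that shouldn’t be shipped.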