Windows was designed around binary distribution. Back in the 1980s, Microsoft was one of the pioneers of this business model: put some software on floppy disks (later CDs), add a manual, box it, shrink-wrap it, and sell it for $$$! Windows still does this well — you can often find a program from 30 years ago that still works mostly OK.
Unix-like systems predate that model and are designed around source code distribution. You can download some source from way way back (I still use code I wrote in the early 90s), type `./configure && make && make install` and there’s a good chance it’ll work.
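For the curious, here's roughly what that classic source-build dance looks like in practice (a minimal sketch; the project name and URL are hypothetical, and real packages vary):

```sh
# Fetch and unpack the source tarball (hypothetical project and URL)
wget https://example.org/foo-1.0.tar.gz
tar xzf foo-1.0.tar.gz
cd foo-1.0

# Probe the system for a compiler, libraries and headers,
# then generate a Makefile tailored to this machine
./configure --prefix=/usr/local

# Compile everything, then copy the results into place
make
sudo make install
```

The `configure` step is the key to the portability: instead of shipping a binary built for one machine, the package adapts itself to whatever compiler and libraries it finds on yours.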
Neither is much good at the other. Binary distribution is often a PITA on Unix-like systems, and building some big pile of source code you got from somewhere else is often impossible on Windows.
The upshot is that they suit different markets. Users generally prefer precompiled programs (obviously), but researchers and developers mostly prefer a source-based system, especially if they are collaborating and want to swap code back and forth with colleagues.
There’s been an interesting shift in computing over the last 10 or 15 years with the absolute dominance of the interwebs. These days, the idea of buying shrink-wrapped software on a CD in a store seems crazy, and almost everything is downloaded, and often “compiled”, on the target machine. The thing that propelled Windows to dominance has become irrelevant, and the strengths of *nix, especially the better technical underpinnings and the lack of 40 years of binary compatibility baggage, seem much more appealing. There’s a reason (almost) no new platforms are based on Windows.