A bit of over-simplified history:
The earliest computers were built from thousands of relays and literally had to have their circuits rewired for each “algorithm”. These computers would malfunction when actual insects got into the circuits, which is the popular origin story for calling software glitches “bugs”.
By the time we got to vacuum-tube computers, we had punched-card input, where a hole represented a closed switch (a “1”) and no hole represented an open switch (a “0”).
These 1s and 0s formed the earliest computer language, which came to be known as “machine language”.
Obviously, humans can’t easily read or write machine language. So programmers invented short mnemonics for the common instructions, e.g. ADD, SUB (subtract), MOV (move), etc. This is known as assembly language, and it requires a program called an assembler to translate it into the machine language that the computer can understand.
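To make the idea of an assembler concrete, here is a minimal sketch in Python (not a real instruction set; the opcode values and the one-byte format are invented purely for illustration):

```python
# Toy assembler: maps three-letter mnemonics to made-up numeric opcodes.
OPCODES = {"MOV": 0b0011, "ADD": 0b0001, "SUB": 0b0010}

def assemble(source: str) -> list[int]:
    """Translate lines like 'ADD 5' into one-byte machine words."""
    program = []
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        # Pack a 4-bit opcode and a 4-bit operand into a single byte.
        program.append((OPCODES[mnemonic] << 4) | int(operand))
    return program

# The 1s and 0s are what the machine actually runs; the mnemonics
# exist purely so humans don't have to memorize bit patterns.
print([f"{word:08b}" for word in assemble("MOV 7\nADD 5")])
# ['00110111', '00010101']
```

Real assemblers do essentially this same mapping, just for actual instruction sets.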
Once we got to this point, it was obvious that computers needed to understand humans, rather than humans learning to “write” machine code.
This started a whole host of high-level languages that humans could easily understand and code in. Different languages were developed to cater for different professional needs: Fortran became the language of choice for physicists and mathematicians, COBOL for banking, C for software developers, etc. There are more specialized languages that you will barely hear about, like ANSYS APDL for stress analysis professionals or the MATLAB programming language for people who work with matrices.
Each of these languages has commands and tools that make it easy for professionals in that field to do their job.
But essentially all of these languages are then translated to machine language. Each language is designed to be most efficient for a particular task.
It’s like using a hammer drill vs. an impact wrench vs. a drill driver. All of them look similar and can sometimes be used interchangeably, but each one is designed for a different professional.
I feel like the “conversion” part of the question hasn’t been answered in the top comments so:
Yes, algorithms _can_ be converted from one language to another, but in general _not easily_! Auto-converters generally do a very poor job, because all they can realistically do is the equivalent of translating a spoken language word by word; for a good translation you need context, cultural background, etc.
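As a rough illustration in Python (the “C-style original” being translated here is hypothetical), compare a word-by-word translation with what a fluent “speaker” of the target language would write:

```python
# A C-style loop "translated" into Python word by word...
def total_literal(values):
    total = 0
    i = 0
    while i < len(values):
        total = total + values[i]
        i = i + 1
    return total

# ...versus what an experienced Python programmer would actually write.
def total_idiomatic(values):
    return sum(values)

assert total_literal([1, 2, 3]) == total_idiomatic([1, 2, 3]) == 6
```

Both produce the same result, but the literal version reads like a thick foreign accent, and automatic converters typically produce the first kind.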
Converting projects from one language to another is something that’s regularly worked on inside companies, and people do allocate time and resources to it because, like some have already said, each language has its own advantages and disadvantages.
Think of it like making art using different mediums. Whether you use crayons, pencil, charcoal, oils, or watercolors, each will produce a picture, and each will bring out some unique aspect of the final art.
Programming languages have similar aspects: some differences are just artistic choices, and some aspects are genuinely better or worse in different languages. One language might be easy to learn but slow at running big programs; another might be really efficient in memory usage but verbose to write; some are very low-level, to give the user full control over minute details.
Depending on what you want the final product to look like and where it will be used, you choose a language accordingly, and of course there are personal preferences in the choice too.
TL;DR: different languages have different strengths and weaknesses.
A programmer might prefer one language over another for many different reasons. They might like the syntax better, or maybe it runs more efficiently. Maybe it gives you finer control over what’s happening on the system, or maybe it manages more stuff for you so you don’t have to think about it.
The biggest reason programmers prefer specific languages though is just plain inertia. “We’ve done it using this language before, so it’ll be easiest to use this language again.” It takes a lot of work to learn a new language, and it takes a lot of work to rewrite code in a new language, so there are some older languages which are still popular today even though they probably wouldn’t be if they were new, (cough cough Java cough cough) simply because there are so many code libraries already written for them and so many developers who already know the language. Programmers refer to this as a “mature ecosystem”.
Python, for example, is popular for data science, and that’s largely because of the excellent data science libraries available for it (NumPy, pandas, and the like).
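For instance, a minimal sketch assuming pandas is installed (the sales data is made up): a grouped sum that would otherwise be a hand-written loop takes a single line:

```python
import pandas as pd

# Made-up sales figures; summing per region is one line with pandas.
sales = pd.DataFrame({
    "region": ["north", "south", "north", "south"],
    "amount": [100, 250, 175, 90],
})
print(sales.groupby("region")["amount"].sum())
# region
# north    275
# south    340
# Name: amount, dtype: int64
```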
Another example: JavaScript is one of the most popular languages for web development, and that’s because it’s essentially the only scripting language that modern web browsers natively understand.
Because software developers care about more things than just “does it work”:
* What languages is the programmer experienced with?
* How easy was it to write the code?
* How efficiently does the code run?
* How easy is it to read?
* Does the program cause any unwanted side effects?
* How likely is it that other programmers will be familiar with the language?
* How easy is it to find errors?
* How reliable or prone to errors is the program?
* How easy would it be to add more features to the code in future, or change particular behaviour, if the requirements change?
* How well does the program handle edge cases, failures, or unexpected inputs?
* How easy is it for the end user to run the program?
* Does the program depend on a specific environment? A particular browser or operating system? Particular hardware?
All of these and more are impacted by the choice of programming language.
Programming is no different to any other field. Pick any task or problem one might encounter in their job or daily life. I guarantee you there is more than one way to achieve the same result. Why do some farmers use tractors when they can move dirt with a shovel? Why would anyone drive across the country when flying is much faster? Why are there so many different flavours of ice cream? Why do nails exist when screws can do the same thing?
Algorithms matter, and sometimes a lot of research and development goes into creating the right algorithm for a specific problem.
_However_ most large software projects involve hundreds of algorithms interacting, or at least coexisting. Most of the time and effort of software engineering involves managing this complexity. The most important differences between languages are the tools and affordances for managing large scale complexity.
Other differences involve trading _portability_ (same code runs the same everywhere) and readability against performance.
Programming languages are just a means of describing what you want the computer to do.
Some things are easier to describe in one language than in another. That is why we have many.
Extra: it’s way easier to describe what kind of snow is falling from the sky in a language that has 34 words for 34 different kinds of snow than in a language that has just 3.