There doesn’t have to be. It’s largely that the computer field has been mostly unregulated and unstandardized, so everyone just does whatever they want.
Some languages are better suited to different use cases, though.
Like C/C++ are good for low-level programming that talks directly to hardware.
Java/C# are good for high level application programming.
Python is good for higher level scripting.
Personally, as a developer, I do get frustrated when people create new languages that don’t offer a significant improvement over an existing one. They could often build on an existing language instead. It’s not just developers who need to learn the new language; the whole tool chain often needs changing and updating, from build tools to security scanning tools…
But the answer to your question is really that the field is largely unregulated and not standardized. Any random developer can start a new programming language, and maybe it catches on. Some companies also create their own language because it locks users into their ecosystem.