Why Data Structures and Algorithms?



I mean yes, it sounds dumb. But I recently got into programming, and with regular, minuscule effort I've managed to get a good grasp of programming concepts. It has been a week since I started learning Data Structures and Algorithms, and every place or site I visit, I get recommended to master the concepts of this particular subject. Why is DSA so important in programming?

In: Other

Down in the weeds, all programs are using algorithms to manipulate data. Learning coding (how to implement a particular algorithm or data structure in a particular language) is necessary to put an algorithm or data structure to work but, by itself, tells you nothing about why to use particular data structures or algorithms. Those are fundamental to the inherent capabilities of your program (how much storage it needs, how fast it can run, what it can do) in a way that has nothing to do with what language you use or how well you code it.

It’s like the difference between understanding music theory and being able to play a musical instrument. You might be a virtuoso on the saxophone and be able to play any piece of music that’s put in front of you perfectly, but it doesn’t mean you can compose great music. There are way more great musicians than great composers…being a great composer is harder.

In computing, there are many ways to arrive at the same result, but each way has different computational complexity, memory requirements, timing characteristics, and security risks.
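As a toy illustration of that point (my own example, not one the answer above mentions): here are two correct ways to ask "does this list contain a duplicate?" They always agree on the result, but their costs grow very differently as the input grows.

```python
def has_duplicates_quadratic(items):
    """Compare every pair of elements: O(n^2) time, O(1) extra memory."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates_linear(items):
    """Track seen elements in a hash set: O(n) time, O(n) extra memory."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

On a ten-element list the difference is invisible; on ten million elements the first version does trillions of comparisons while the second does ten million set lookups. Same result, wildly different program, and no programming language will save you from picking the wrong one.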

By understanding algorithms, you'll know the right tool for the particular job in front of you; it will give you the ability to predict and diagnose problems with your program, and also the foundation to create your own algorithms.

Optimizing algorithms for a specific task can be incredibly rewarding. Cryptominer algorithms are a good example – the speed difference between when they were first created and now is over 1000x on the same hardware, and still improving. CryptoNote specifically was never thought to be viable on anything but an x86 CPU due to its engineered affinity for a large, fast cache; just a year later it was running a million times quicker on GPUs, which have really tiny caches, and a few years after that it was running a further million times faster on ASICs with no memory at all.