Big Oh (BOh) is a way to describe how the time an algorithm needs grows as the input size gets larger. It is an upper bound: a guarantee that the runtime grows no faster than some rate. Say an algorithm takes 10 seconds to solve a problem when the input size is 100 and about 100 seconds when the input size is 1,000; its runtime grows roughly in proportion to the input, so we would say the algorithm is O(n). Note that BOh describes a growth rate, not a fixed number of seconds.

Big Omega (BOm) is similar to BOh, but it is a lower bound: it describes a rate that the runtime grows at least as fast as. So if an algorithm must look at every element of its input at least once, its runtime can never grow slower than the input size, and we say the algorithm is Ω(n).

And finally, Big Theta (BTh) describes a tight bound: the runtime grows at that rate, no faster and no slower. If an algorithm is both O(n) and Ω(n), then it is Θ(n), meaning its runtime grows in direct proportion to the input size.

In an ideal world we want algorithms whose BOh (and therefore BTh) is as small as possible, because that means they solve problems quickly even as the input size grows.

Big theta notation (and the others) describes how the runtime of an algorithm scales as the input size increases. A theta(1) algorithm takes the same amount of time no matter how many elements there are. Getting a value from an array is the same whether it is the 3rd or 300th element. A theta(n) algorithm’s runtime scales the same as the input. If you are finding the maximum value in an array, you have to check each number to find it, so it scales with the input. More complicated algorithms can have higher and higher big theta. A selection sort (not the best sorting algorithm, but a good example) will go through each number in a list, find the lowest, then add it to a new list, repeating until the new list contains each element in order. This requires going through a list of n items n times, so it is theta(n^2). When you double the size of the input, it will take about 4x as long to complete. Similarly, when you have a theta(n^3) algorithm, doubling the input will multiply the runtime by 8.
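The selection sort described above can be sketched in Python (the function name is ours, just for illustration):

```python
def selection_sort(items):
    """Selection sort: Theta(n^2) regardless of the input's initial order."""
    items = list(items)  # work on a copy so the caller's list is untouched
    result = []
    while items:
        # Scanning all remaining items for the minimum takes time
        # proportional to how many remain, so the total work is
        # n + (n-1) + ... + 1, which grows like n^2.
        lowest = min(items)
        items.remove(lowest)
        result.append(lowest)
    return result

print(selection_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 1, 2, 3, 4, 5, 6, 9]
```

Doubling the list length roughly quadruples the number of comparisons, which is exactly the theta(n^2) behavior described above.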

There is a more precise mathematical definition of big theta notation, but this understanding is good enough for now. Big O is basically an upper bound on how the runtime grows, and big omega is basically a lower bound on how the runtime grows. An O(n) algorithm is known to take at most time proportional to n; an Omega(n) algorithm is proven to take at least time proportional to n. If a problem is both O(n) and Omega(n), then it is also theta(n).
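The "both O(n) and Omega(n), therefore theta(n)" idea can be illustrated with the max-finding example mentioned earlier (a sketch; the function name is ours):

```python
def find_max(values):
    """Find the largest value in a non-empty list.

    It never makes more than n - 1 comparisons, so it is O(n).
    It must also examine every element at least once (skipping any
    element could miss the maximum), so it is Omega(n).
    Being both O(n) and Omega(n), it is Theta(n).
    """
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best

print(find_max([3, 7, 2, 9, 4]))  # → 9
```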

Big O, Big Omega, and Big Theta are ways of expressing the asymptotic complexity of algorithms, which refers to how the runtime of an algorithm scales as the input size increases. These notations provide a way to compare the efficiency of different algorithms and to understand how well an algorithm will perform as the input size grows.

Here is a brief overview of each notation:

Big O: Big O notation is used to describe an upper bound on the asymptotic complexity of an algorithm. It tells you the worst-case scenario for the runtime of an algorithm. For example, if an algorithm has a complexity of O(n), it means that the runtime will never exceed a certain multiple of n, where n is the size of the input.

Big Omega: Big Omega notation is used to describe a lower bound on the asymptotic complexity of an algorithm. It tells you a guaranteed minimum on the runtime of an algorithm (which is why it is often used to describe best-case behavior). For example, if an algorithm has a complexity of Ω(n), it means that the runtime will always be at least a certain multiple of n, where n is the size of the input.

Big Theta: Big Theta notation is used to describe a tight bound on the asymptotic complexity of an algorithm. It tells you both an upper and lower bound on the runtime of an algorithm. For example, if an algorithm has a complexity of Θ(n), it means that the runtime will always be between a certain multiple of n and another multiple of n, where n is the size of the input.

In general, you want an algorithm to have a low asymptotic complexity, which means that it will scale well and be efficient even for large input sizes.
