The tale of Big-O: navigating the world of asymptotic notations in computer science.

Once upon a time, in a world bustling with complex algorithms and data structures, there was no standard way of measuring and comparing their efficiency. Programmers in different parts of the world grappled with this challenge. Without a standard way of expressing algorithmic complexity, comparing algorithms proved daunting and often led to more confusion and less efficiency.

This continued for a while, until Paul Bachmann, a German mathematician, introduced the concept of asymptotic notation, which describes the behavior of mathematical functions as their inputs grow.

Some years later, another mathematician, Edmund Landau, eased the world's pain further by adopting and popularizing the notation in the early 20th century, and decades after that, Donald Knuth popularized the now-familiar trio of Big-O, Big-Theta, and Big-Omega for algorithm analysis. Their work laid the groundwork for modern algorithm analysis and provided a standardized way to express algorithmic complexity. Talk about the doctors of the computer science world.

Asymptotic notation describes the running time of an algorithm: how much time it takes as a function of the size of its input, N. It is a fundamental concept in computer science and programming that helps us analyze and describe the efficiency of algorithms.

Borrowed from mathematics, it provides a standardized method of expressing how the runtime or resource usage of an algorithm grows as the size of the input data increases.
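To make that growth concrete, here is a minimal sketch (the function names are my own, purely illustrative) comparing two ways of checking whether a list contains a duplicate: one takes roughly N squared steps, the other roughly N.

```python
def has_duplicates_quadratic(items):
    # Compare every pair of elements: about n * n steps, so O(n^2).
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicates_linear(items):
    # One pass with a set of things already seen: about n steps, so O(n).
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both functions return the same answers, but as N grows, the quadratic version's runtime grows far faster than the linear one's. Asymptotic notation captures exactly that difference, independently of any particular machine.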

In describing algorithmic complexities, there are different notations:

  • Big-O: Despite being just one of the three major asymptotic notations, Big-O is the most popular. Ask me why. I don’t know either. But what I do know is what it does. It describes the worst-case running time of an algorithm: an upper bound on how long the algorithm may take to run, given an input of size N.

    Longest possible running time... maybe that’s why it is so popular; programmers must always plan for the worst-case scenario.

  • Big-Theta: Another of the three musketeers. Formally, it is a tight bound: it bounds an algorithm’s running time from both above and below, pinning down its growth rate exactly. That is why it is often used to describe an algorithm’s typical, average-case performance.

  • Big-Omega: Lastly, Big-Omega. In mathematical terms, it provides the asymptotic lower bound of a function, i.e., the best-case running time.
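The three bounds above can be seen in a single, familiar algorithm. As an illustration (a minimal sketch of my own, not from the text), consider linear search: in the best case the target is the very first element, so the algorithm finishes in one step, and in the worst case the target is last or absent, so it examines all N elements.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent.

    Best case:  target is the first element -> 1 step,  Omega(1).
    Worst case: target is last or missing   -> n steps, O(n).
    Typical case: about n/2 steps on average, which is Theta(n).
    """
    for index, item in enumerate(items):
        if item == target:
            return index
    return -1
```

Same code, three different bounds, depending on which scenario you are describing. That is the whole point of having all three notations.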

It is important to note that it is impossible to predict the exact behavior of an algorithm. There are too many variables at play on any given machine, so analysis using asymptotic notation is only an approximation. But let’s take a moment to look at the flip side: the world without asymptotic notations.

Bob just built a wonderful piece of software that can solve most of man’s problems. But there is one little problem. The code is too slow. So he builds the same solution using different algorithms. All he has to do is to check which one is more efficient.

Easy-peasy, innit? But in this dystopian era, there is no standardized way of measuring the efficiency of an algorithm in terms of runtime and memory usage. Bob decides to try his different algorithms using different methods in a bid for a solution.
For some inputs, he notices that the first algorithm performs better than the second on one machine but worse on another.

For other inputs, the rankings flip entirely. It was an unfortunate state to be stuck in, and Bob was slowly losing his mind. Just before falling into a state of perpetual despair, he went searching for answers. In his quest, he stumbled across a secret library, and within its walls, he found a book about asymptotic notations, and within it, all his questions were answered.

His software solution catalyzed the development of technological innovations to the point we have today. Bob went on to win an intergalactic space award for best innovator of the universe... People in the world of computer science believe that he has a seat right next to God...

Analysis of algorithms is very important. It is not just some stuff made up to keep people like us out of computer science; it helps us predict the behavior of an algorithm without implementing it on a specific computer. As noted earlier, that prediction is only an approximation, but an approximation is far better than no common language at all.

You may be asking: there are so many other important concerns, like user experience and security, so why worry about performance? The thing is, performance is the elixir of software development. It is much easier to get all the other features implemented on top of a foundation that performs well.