An Easy-To-Use Guide to Big-O Time Complexity

AKA The Space-Time Complexity

What is Time-Complexity? So your program works, but it's running too slow. Or maybe your nice little code is working out great, but it's not running as quickly as that other lengthier one. Welcome to Big-O Time Complexity, where repetition is the enemy, and arrays reign supreme. I'm kidding, kind of. Big-O notation is a common means of describing the performance or complexity of an algorithm in Computer Science. Practically speaking, it is used as a barometer to test the efficiency of your function, and whether or not it can be improved upon. Time-Complexity will become an important aspect of your dev environment as you continue to progress through your studies. Eventually, you'll be able to analyze your program and determine which of its functions fall within which time complexity (constant, linear, quadratic, exponential, logarithmic). In this guide, we'll be breaking down the basics of Time-Complexity in order to gain a better understanding of how it looks in your code and in real life.

O(1) - Constant Time: Constant Time Complexity describes an algorithm that will always execute in the same time (or space) regardless of the size of the input data set. In JavaScript, this can be as simple as accessing a specific index within an array:

var array = [1, 2, 3, 4, 5];
array[0]; // This is a constant time look-up

var removeLastElement = function(array) {
  var numberOfElementsToRemove = 2;
  while (numberOfElementsToRemove > 0) {
    array.pop();
    numberOfElementsToRemove--; // Decrement so the loop terminates
  }
};
// This will also run in constant time, since the function removes a fixed
// number of elements regardless of how large the array is.

It doesn't matter whether the array has 5 or 500 indices; looking up a specific location of the array would take the same amount of time, therefore the function has a constant time look-up.

In much the same way, if I had a deck of cards and I asked you to remove any card at random, you could simply grab a card from the deck without having to search through the entire deck. That is a constant time look-up.

O(N) - Linear Time: Linear Time Complexity describes an algorithm or program whose complexity will grow in direct proportion to the size of the input data. As a rule of thumb, it is best to try and keep your functions running at or below this range of time-complexity, but obviously it won't always be possible.

var numberRange = function(array) {
  for (var i = 0; i < array.length; i++) {
    console.log(array[i]);
  }
};
// This will have a linear time look-up, since the function is looking at
// every index in the array once.

In this example, the look-up time is directly related to the size of our input because we will be going through each item within the array. In other words, the larger the input, the greater the amount of time it takes to perform the function. Of course, if the array only had 1 index (i.e. array.length === 1), then the function would have a constant time look-up.

Now, back to our deck of cards analogy. If I had a deck of cards and I wanted you to select a specific card (let's say the 10 of diamonds), you would have to look through every card until you found it. Sure, there's a possibility that it would be the first card in the deck, but that's highly unlikely. Now think about if the deck were filled with hundreds of other, non-matching cards. Your search is directly related to how large the deck of cards is. That is an example of Linear Complexity.
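The card search above can be sketched as a simple linear scan. This is my own illustrative code (the name findCard is not from the original examples): it checks each element in order, so the worst case grows with the size of the input.

```javascript
var findCard = function(deck, card) {
  // Check each card in order until we find a match: O(N) in the worst case.
  for (var i = 0; i < deck.length; i++) {
    if (deck[i] === card) {
      return i; // Found it at position i.
    }
  }
  return -1; // Card is not in the deck.
};
```

If the card happens to be first, the loop exits immediately, but Big-O describes the worst case, which here means scanning the whole deck.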

O(log(n)) - Logarithmic Time: Logarithmic Time Complexity refers to an algorithm whose running time grows in proportion to the logarithm of the input size. Now, if you haven't worked with Binary Search Trees, this may be a little difficult to conceptualize, so I'll try and demonstrate it clearly below.

// NOTE: This function is not an example of a Binary Search Tree
var partialRange = function(array) {
  for (var i = 1; i < array.length; i = i * 2) {
    console.log(array[i]);
  }
};
// This will have a logarithmic time look-up, since the function only
// searches through part of the array, doubling the index on every pass.

In the function above, we are logging numbers until the array length is passed. However, we can see that our function will only be logging numbers at a certain interval. Since we will be repeating a constant time look-up on multiple parts of the array, our function will not have a constant time look-up. It should also be noted that since our function does not pass through every index in the array, it will not have a linear time look-up either.

Using our deck of cards analogy, let's say our cards are sorted in increasing order, with the deck split in half, one pile containing the clubs and spades, the other containing the hearts and diamonds. Now, if I asked you to pull out the 10 of diamonds, you could safely conclude that the card should be in the second pile, since that's where the diamonds and hearts are. Furthermore, you know that the card should only be with the diamonds, so you split that pile to only search through the diamond cards. Since you'll be searching for the 10, you can safely assume that the bottom half of the remaining pile is irrelevant (since the deck is already sorted). You can continue to split the remaining cards until you eventually find the 10 of diamonds. You didn't have to search through every card to find it, but it also wasn't as easy as simply grabbing a random card. That is an example of Logarithmic Time Complexity.
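Since the note above says partialRange is not a real binary search, here is a minimal sketch of what the deck-splitting analogy looks like as actual code. This is my own illustrative function (not from the original post), and it assumes the input array is already sorted:

```javascript
var binarySearch = function(sortedArray, target) {
  var low = 0;
  var high = sortedArray.length - 1;
  while (low <= high) {
    // Look at the middle card of the remaining pile.
    var mid = Math.floor((low + high) / 2);
    if (sortedArray[mid] === target) {
      return mid; // Found the card.
    } else if (sortedArray[mid] < target) {
      low = mid + 1; // Target is higher; discard the bottom half.
    } else {
      high = mid - 1; // Target is lower; discard the top half.
    }
  }
  return -1; // Card is not in the deck.
};
```

Each pass discards half of the remaining elements, so the number of comparisons grows with log(n) rather than n.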

O(N^2) - Quadratic Time: Quadratic Time Complexity represents an algorithm whose performance is directly proportional to the square of the size of the input data set (think of Linear, but squared). Within our programs, this time complexity will occur whenever we nest multiple iterations over the data sets.

var hasDuplicates = function(array) {
  for (var i = 0; i < array.length; i++) {
    var item = array[i];
    if (array.slice(i + 1).indexOf(item) !== -1) {
      return true;
    }
  }
  return false;
};
// This will have a quadratic time look-up, since for every index the
// function scans through the rest of the array with indexOf.

This function is a great example of how easy it is to pass through an array twice without using two for loops. It's clear in the first part that our function will be searching through the array at least once, but the difference-maker lies in the if statement. It is here that we perform another look-up of the array with the native indexOf method. In doing so, we scan the remainder of the array for every element we visit, producing a Quadratic Time Complexity.

Let's revisit our deck of cards. Say you picked the first card in the deck and had to remove all the cards with that same number. You would have to search through the deck, removing the duplicates at every point. Once you were sure that all the duplicates were removed, you'd continue doing the same thing for the second card, and the third, and so on until you reached the end of the deck. This is an example of Quadratic Time Complexity.
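For comparison, the same duplicate check can be written with two explicit nested loops, which makes the quadratic shape easier to see. This is my own illustrative variant (hasDuplicatesNested is not a name from the original post), and it behaves the same as hasDuplicates above:

```javascript
var hasDuplicatesNested = function(array) {
  // For each element, compare it against every later element.
  // Roughly N * N / 2 comparisons in the worst case: O(N^2).
  for (var i = 0; i < array.length; i++) {
    for (var j = i + 1; j < array.length; j++) {
      if (array[i] === array[j]) {
        return true;
      }
    }
  }
  return false;
};
```

Whether the inner pass is a literal for loop or a call like indexOf, the cost is the same: one full scan nested inside another.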

O(2^N) - Exponential Time: Exponential Time Complexity denotes an algorithm whose growth doubles with each addition to the input data set. If you know of other exponential growth patterns, this works in much the same way. The running time starts off very shallow, then rises at an ever-increasing rate.

var fibonacci = function(number) {
  if (number <= 1) return number;
  return fibonacci(number - 2) + fibonacci(number - 1);
};
// This will have an exponential time look-up, since each call spawns two
// more recursive calls until the base case is reached.

Fibonacci numbers are a great way to practice your understanding of recursion. Although there is a way to give the fibonacci function a lower time complexity, for this case we will be using the doubly-recursive method to show how it has an Exponential Time Complexity. In a few words, each call works its way from the input number down to zero, and it will do this twice (once for each recursive call) until it reaches the base case if statement. Since the number of recursive calls grows exponentially with the input number, it is easy to see how this look-up time will quickly grow out of hand.
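As a sketch of the lower-time-complexity version mentioned above, one common approach is memoization: cache each result so every number is only computed once, bringing the recursion down to linear time. The names memo and fibonacciMemo are my own, not from the original post:

```javascript
// Cache of previously computed results, keyed by input number.
var memo = {};

var fibonacciMemo = function(number) {
  if (number <= 1) return number;
  // If we've already computed this number, return the cached result
  // instead of recursing again.
  if (memo[number] !== undefined) return memo[number];
  memo[number] = fibonacciMemo(number - 2) + fibonacciMemo(number - 1);
  return memo[number];
};
```

With the cache in place, each value from 0 up to the input is computed exactly once, so the look-up grows linearly instead of doubling with each increment.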

If you're still looking for a quick reference to help you understand this (or any of the other time complexities) better, I'd highly recommend checking out the Big-O Cheat Sheet. Hope that helped!

