Understanding Big-O with JavaScript

Finding the complexity of a program is an important task. It gives us a clear indication of how much slower or faster our program will get as we feed it more input data. When you write a performance-sensitive program, it is always a good idea to keep its algorithmic complexity in mind. In general, we measure complexity with Big-O notation.

Generally, the speed of a program depends on various factors, such as:

  • The compiler.
  • Other running programs.
  • The hardware.
  • The browser, etc.

Big-O doesn’t care about any of these components. It cares about the input size: based on the input size, Big-O gives us a good indication of the running time. When developing an application, Big-O describes how much an API slows you down as your codebase grows (learned from Dan’s blog).

There are two kinds of complexity.

  • Time Complexity
  • Space Complexity

We will go through Time Complexity.

O(1) – Constant Time Complexity

This applies when you do an assignment, call a function, or perform a basic mathematical operation: the work takes the same amount of time regardless of the input size.

Example of O(1)

function doSomething() {
    let name = "Mark"    // assignment – O(1)
    let result = 5 + 5   // arithmetic – O(1)
}

In this example the name variable is assigned a string and the result variable holds the result of a mathematical operation. The assignment is O(1) and the calculation is O(1), so we can say the whole function is O(1).

Another example of O(1)

let numbers = [1, 2, 3, 4, 5]

function getNumber(val) {
    console.log(val)
}

getNumber(numbers[0])

The complexity of the getNumber() function is O(1). If the size of the numbers array increases, the time it takes to run getNumber() won’t change, as it always prints the single value it is given (here, the first element). No matter what the input size is, the complexity of getNumber() stays constant.

Another useful example of O(1) is a HashMap. For insert, delete, access or search, the complexity is O(1) on average.
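As a rough illustration (a minimal sketch using JavaScript’s built-in Map, which behaves like a HashMap for our purposes), each of these operations takes about the same time no matter how many entries the map holds:

const ages = new Map()

ages.set("Mark", 25)   // insert – O(1) on average
ages.get("Mark")       // access – O(1) on average
ages.has("Mark")       // search – O(1) on average
ages.delete("Mark")    // delete – O(1) on average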

The graph of O(1)

From this graph we can see that no matter how much we increase the input size, the runtime stays constant.

O(n) – Linear Time Complexity

O(n) complexity grows with the input size. If you have an array containing 50 elements and you want to print that array, you can use a for loop that traverses 50 elements. In the same way, if you have 100 elements in an array, you can traverse them with a for loop. Look at the numbers 50 and 100: as those numbers increase, the running time increases at the same rate.

// Example of O(n)

function printNumbers(arr) {
    for(let i = 0; i < arr.length; i++) {
        console.log(arr[i])
    }
}

printNumbers([1, 2, 3, 4])
printNumbers([1, 2, 3, 4, 5])

In this example the complexity of the printNumbers() function is O(n): for an array of length n we perform n operations, so the running time grows proportionally with the input size.

So for inserting, deleting or searching, the complexity of an array is O(n) in the worst case.
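For example (a minimal sketch using standard array methods), searching for a value and inserting at the front both have to touch up to n elements:

const items = [3, 7, 1, 9, 4]

items.indexOf(9)     // search – scans elements one by one, O(n) worst case
items.unshift(0)     // insert at the front – shifts every element, O(n)
items.splice(2, 1)   // delete from the middle – shifts the rest, O(n)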

The graph of O(n)

From this graph we can see that as the input size increases, the runtime increases linearly.

O(log n) – Logarithmic Time Complexity

O(log n) is much more efficient. The runtime grows very slowly as the input data increases: doubling the input size adds only one extra step.

A classic example of it is binary search.

If we have a sorted array of 1,000 elements and we are trying to find an element in it, the first step is to find the midpoint of that array and check whether the desired element is greater or less than the element at the midpoint. If the middle element is greater than the element we are looking for, discard the right half; otherwise discard the left half. Repeat that process until we find the desired element.

function findElement(arr, desired_element) {

    let min = 0, max = arr.length - 1, current_index;

    while (min <= max) {
        // look at the middle of the remaining range
        current_index = Math.floor((min + max) / 2);

        if (arr[current_index] === desired_element) {
            return current_index;
        }

        if (arr[current_index] < desired_element) {
            // discard the left half
            min = current_index + 1;
        } else {
            // discard the right half
            max = current_index - 1;
        }
    }

    return -1; // not found
}
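A quick usage example (the array must already be sorted, as binary search requires):

const sorted = [1, 3, 5, 7, 9, 11]

findElement(sorted, 7)    // returns 3 (the index of 7)
findElement(sorted, 4)    // returns -1 (not present)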

In each iteration we cut the remaining elements in half, which can be a huge win for performance.

The graph of O(log n)

From this graph we can see that as the input size increases, the runtime grows very slowly (logarithmically).

O(n²) – Quadratic Time Complexity

In contrast to logarithmic complexity, here the runtime is proportional to the square of the input size, so it performs very slowly on a large data set.

An example of O(n²)

function doSomething(n) {
    let sum = 0;
    // two nested loops over n elements → n * n operations
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < n; j++) {
            sum = sum + 1
        }
    }
    return sum;
}

doSomething(5)

In this example, if the input is 4 then the output is 16, and for input 5 the output is 25. So the number of operations grows as n², and the complexity is O(n²).
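Calling it with a few different sizes makes the quadratic growth visible (the return value equals the number of inner-loop operations performed):

console.log(doSomething(4))    // 16
console.log(doSomething(5))    // 25
console.log(doSomething(10))   // 100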

The graph of O(n²)

From this graph we can see that as the input size increases, the runtime increases with the square of the input size.

Complexities for different data structures.

Name                    Insert    Access    Search    Delete
Array                   O(n)      O(1)      O(n)      O(n)
HashMap                 O(1)      O(1)      O(1)      O(1)
Linked List (singly)    O(1)      O(n)      O(n)      O(1)
Stack (array)           O(1)      –         –         O(1)
Queue                   O(1)      O(n)      –         O(1)
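As a small illustration of why these numbers matter (a sketch comparing a plain array with a Map; the variable names are just for the example), searching an array scans every element while a Map lookup does not:

const users = ["Mark", "Anna", "John"]
users.includes("John")   // array search – O(n), scans the list

const userAges = new Map([["Mark", 25], ["Anna", 30], ["John", 28]])
userAges.has("John")     // HashMap-style search – O(1) on average
userAges.get("John")     // access – O(1) on average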

