Big O

Suppose we have a code block whose performance we want to measure. The first thought that jumps to mind is how fast it executes, and that sounds reasonable, so let's start by discussing speed.

Speed… is that all?!

Let's take a code snippet and measure its execution speed first.

const {performance} = require('perf_hooks');

function print () {
    let x = 2 + 3;
    return x;
}
console.log(print())

It's a very simple function: just an addition operation that returns the result. Let's measure its speed by adding a few extra lines.

const {performance} = require('perf_hooks');

function print () {
    let x = 2 + 3;
    return x;
}
// start timer
let t1 = performance.now();
// execute the operation
console.log(print());
// stop timer
let t2 = performance.now();
// performance.now() returns milliseconds, so divide by 1000 to get seconds
console.log((t2 - t1) / 1000);

Now let's run it and see the result.

[Screenshot: the measured time printed by seven separate runs of the same snippet]


I ran the same code 7 times and got a different result each time.

Obviously that makes sense: your machine's processor isn't running only your code. Try running the same snippet while something like Photoshop is occupying most of the processor's capacity, and even though the code is very simple, the measured time will be far larger than mine.

Now assume you have two more complex pieces of code, each implementing a sorting algorithm, and you want to know which one performs better. The previous method is clearly not up to the task, so time is not a good criterion to rely on, because:

  • if you run the same code on different machines, you will get different timings
  • if you run the same code on the same machine many times, you will also get different results
  • if your code block is small and fast, the measured time won't be precise

The sketch below illustrates the last two points.
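Here's a minimal sketch (Node.js, the same perf_hooks timer as above) that times the exact same trivial call several times in a row; the printed durations will typically differ from run to run, which is the variability described above.

const { performance } = require('perf_hooks');

function print() {
  let x = 2 + 3;
  return x;
}

// time the same call 7 times and print each duration in seconds
for (let run = 1; run <= 7; run++) {
  let t1 = performance.now();
  print();
  let t2 = performance.now();
  console.log(`run ${run}: ${(t2 - t1) / 1000} seconds`);
}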


So, instead of counting seconds, what if we count operations?

Don’t count on time, count operations

Let's look at a new snippet.

function count (n) {
  let total = 0;
  for (let i = 0; i <= n; i++) {
    total += 1;
  }
  return total;
}

Now we'll count how many operations we go through inside the function.

function count (n) {
  let total = 0;  // 1 assignment
  for (let i = 0; i <= n; i++) { // 1 assignment, n comparisons, n additions and n assignments
    total += 1; // n additions and n assignments
  }
  return total;
}

So we have 5n + 2 operations in total. Now we have a clear method that produces a formula which doesn't vary between machines or processors, because the same number of operations will be executed anywhere, regardless of the environment.
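As a rough sketch (countOperations is just an illustrative name, and the tally mirrors the counting we just did, treating the loop as n iterations), we can let the code add up its own operations and confirm the 5n + 2 formula:

function countOperations(n) {
  let ops = 2;                      // let total = 0 and let i = 0 -> 2 assignments
  for (let i = 0; i < n; i++) {     // count n iterations, as in the tally above
    ops += 5;                       // i <= n, i++ (add + assign), total += 1 (add + assign)
  }
  return ops;
}

console.log(countOperations(10));   // 52  (5 * 10 + 2)
console.log(countOperations(1000)); // 5002, grows linearly with n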

In the previous example, the number of operations grows as the input n grows. But what if we have a real-life code block that contains a lot of lines? Do you expect to count every operation on every line? That's crazy.


Here comes your big hero: Big O.

Big O

Big O solves the problem of counting operations: it only cares about the trend. Let's see how Big O expresses time complexity.

Big O: Time Complexity

Let's plot the previous formula for the counted operations (5n + 2).

[Plot: number of operations (5n + 2) against input size n, a straight line]


Hmm, it looks like a linear function: as the input n increases, the number of operations also increases linearly. And as we mentioned before, Big O notation cares about the TREND, not the DETAILS,

so instead of writing it as O(5n+2), it is written as O(n).

Let's look at more examples.

n * (n + 1) / 2   // 1 multiplication operation, 1 addition, 1 division

In the previous example, we don't have any loops, just 3 operations, so it's O(3). But because the number of operations is a constant, it is written as O(1) instead.
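For instance, wrapping that formula in a function (a small sketch; sumUpTo is just an illustrative name) gives constant work no matter how large n is:

// sums 1 + 2 + ... + n using the closed-form formula: always 3 operations -> O(1)
function sumUpTo(n) {
  return n * (n + 1) / 2;
}

console.log(sumUpTo(5));        // 15
console.log(sumUpTo(1000000));  // 500000500000, the same amount of work as for n = 5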

So, since Big O cares about the trend, any simple loop over the input is generally considered O(n). But what if we have 2 loops in sequence, or 2 nested loops?

function anything(n) {

  for (let i = 0; i < n; i++) {
    //code
  }

  for (let j = 0; j < n; j++) {
    //code
  }

}

If we have two sequential loops, each with n repetitions, then each loop on its own is certainly O(n). But what's the notation for the whole function: is it O(n + n) or O(2n)? Actually no, because, as we mentioned before, we neglect the details. The number of operations still grows linearly as the input grows, so it's O(n).
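A quick sketch (countTwoLoops is a made-up name for illustration) that counts the iterations of two sequential loops; doubling n doubles the count, so the growth stays linear:

function countTwoLoops(n) {
  let iterations = 0;
  for (let i = 0; i < n; i++) {
    iterations++;            // first loop: n iterations
  }
  for (let j = 0; j < n; j++) {
    iterations++;            // second loop: n more iterations
  }
  return iterations;         // 2n in total, but the trend is still linear -> O(n)
}

console.log(countTwoLoops(10));  // 20
console.log(countTwoLoops(20));  // 40: double the input, double the work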

Another example:

function anything(n) {

  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      //code
    }
  }
}

A nested loop. OK, each loop on its own is still O(n), but is the whole function still O(n) as well?

Let's see: it's n repetitions inside another n repetitions, so we have O(n*n), or O(n^2). If you plot the number of operations as the input grows, you get a quadratic function.
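The same counting trick for the nested case (countNestedLoops is again just an illustrative name); doubling n quadruples the count:

function countNestedLoops(n) {
  let iterations = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      iterations++;          // the inner body runs n times for each of the n outer iterations
    }
  }
  return iterations;         // n * n -> O(n^2)
}

console.log(countNestedLoops(10));  // 100
console.log(countNestedLoops(20));  // 400: double the input, four times the work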

Another example:

function anything1 (n) {
  for (let i = 0; i < Math.max(5, n); i++) {
    // code
  }
}

function anything2 (n) {
  for (let i = 0; i < Math.min(5, n); i++) {
    // code
  }
}

/*
* if you're not familiar with JS,
* Math.max(x,y) returns the larger of the two numbers
* and Math.min(x,y) returns the smaller one.
*/

For the first function, anything1, the loop repeats Math.max(5, n) times.

So if n = 1, the loop repeats 5 times, since 5 > 1.

If n = 2, it still repeats 5 times, since 5 > 2, and so on. Starting from n = 6, the number of iterations grows with the input n. Let's plot it.

[Plot: iterations of anything1 against n, flat at 5 up to n = 5, then growing linearly]


So it seems that from n = 6 onward the number of operations increases linearly; if n is a million, the number of iterations will be a million. The trend here is a linear function, so the first function has complexity O(n).

In the second function, anything2, we have the opposite case: if n = 1, the number of iterations is 1; if n = 2, the number of iterations is 2, and so on. But from n = 5 onward, no matter how large the input gets, the number of iterations stays at 5, as shown in the following graph.

[Plot: iterations of anything2 against n, growing up to n = 5, then flat at 5]


So the trend here is clear: as the input grows, the number of iterations stays at 5,

so the function's complexity is O(1), since it's constant.
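If you'd rather see both trends as numbers instead of a plot, a small sketch like this (reusing the loop bounds of anything1 and anything2 above) prints the iteration counts for growing n:

function iterationsMax(n) {
  let count = 0;
  for (let i = 0; i < Math.max(5, n); i++) count++;  // same bound as anything1
  return count;
}

function iterationsMin(n) {
  let count = 0;
  for (let i = 0; i < Math.min(5, n); i++) count++;  // same bound as anything2
  return count;
}

for (let n of [1, 2, 5, 10, 100, 1000]) {
  console.log(n, iterationsMax(n), iterationsMin(n));
}
// iterationsMax grows with n (O(n)); iterationsMin is capped at 5 (O(1))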

Now let's summarize how Big O calculates, or rather expresses, time complexity:

  • Forget the constants … only care about the trend

O(2n) --------> O(n)

O(500) --------> O(1)

O(20n^2) --------> O(n^2)

O(20n+10) --------> O(n)

O(n^2+5n+10) --------> O(n^2)

  • Arithmetic operations, logical operations, and assignment operations all have O(1)
  • In loops, whatever constant-time code runs inside the loop, the loop's time complexity is O(n), where n is the number of iterations; for nested loops, the complexity is the product of the loop lengths (see the sketch below)
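Putting these rules together on a single made-up function (printStuff is purely illustrative): the constant part and the single loop are dominated by the nested loops:

function printStuff(n) {
  let label = 'value';                 // O(1): assignments are constant time

  for (let i = 0; i < n; i++) {        // O(n)
    console.log(label, i);
  }

  for (let i = 0; i < n; i++) {        // O(n * n) = O(n^2)
    for (let j = 0; j < n; j++) {
      console.log(i + j);
    }
  }
}
// total: O(1) + O(n) + O(n^2) -> keep only the dominant trend -> O(n^2)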

Until now we have used Big O notation to analyze time complexity by giving a general expression for how the number of operations grows. Big O can give us another benefit by expressing space complexity as well: space complexity cares about how much memory your code occupies, so let's see how.


Big O: Space Complexity

Note that when talking about space complexity here, we aren't talking about the space occupied by the input; we are talking about auxiliary space complexity, which is the space occupied or used by your algorithm itself.

So let's set some new rules:

  • Booleans, numbers, undefined, and null take constant space, so they have space complexity O(1)
  • Strings, arrays, and some object operations (like returning the number of keys or an array of keys) have space complexity O(n), where n is the length

Example:

function count (n) {
  let total = 0;  // 1 number
  for (let i = 0; i <= n; i++) { // 1 number
    total += 1;
  }
  return total;
}

As you can see in the previous example, first we have memory allocated for a number assigned to the variable total, then another space allocated for the iterator variable i, so we only use 2 spaces. We still follow Big O's biggest rule, 'trend, not details', so instead of writing O(2), we write O(1), since it's a constant.

another example

function myDoubleList (arr) {
  let list = [];                           // grows to arr.length numbers
  for (let i = 0; i < arr.length; i++) {   // 1 number for the iterator
    list.push(arr[i] * 2);
  }
  return list;
}

We have an array list that takes O(n) space, where n is the array length, plus 1 space for a number, so overall it's O(n).
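The same rules apply to strings. In this sketch (repeatChar and countChar are illustrative names), the first function builds a string whose length grows with n, while the second keeps only a single number:

function repeatChar(n) {
  let s = '';
  for (let i = 0; i < n; i++) {
    s += 'a';              // the string grows to length n -> O(n) space
  }
  return s;
}

function countChar(n) {
  let count = 0;
  for (let i = 0; i < n; i++) {
    count += 1;            // only one number is kept, no matter how big n is -> O(1) space
  }
  return count;
}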

At the end of the day, we've seen how Big O can express both time and space complexity, which helps us improve the performance of our code and gives us a common baseline for evaluating it independently of the host environment.

You can check this cheat sheet for the Big O of many data structures and algorithms:

https://www.bigocheatsheet.com/
