Definitive Guide for Algo Analysis - Part One
Venkatesh Kamath
SDE 2 @ HashedIn by Deloitte | Tech Advisor | Web Engineer | WKC Scholar | Finalist of SAP Semicolon
Talking about the basic analysis of algorithms in greater depths
Why is it essential to talk about the Analysis of a problem or an algorithm?
For interviews? No. For basic knowledge? Not really. It is the foundation of problem-solving: for any given problem, or set of problems, you need to know whether a given solution is truly efficient and reliable, and as a developer it is difficult to decide which solution, or which piece of code, to opt for.
Consider a standard example: find the sum of the first n natural numbers, given an input n. Take a moment, pause for a while; I have three solutions.
Solution 1(Math Formula)
int one(int n){
    return n * (n + 1) / 2; // straightforward math formula
}
Solution 2(Single loop)
int two(int n){
    int sum = 0;
    for(int i = 0; i <= n; i++){
        sum = sum + i; // sum += i
    }
    return sum;
}
Solution 3(Double loop)
int three(int n){
    int count = 0;
    for(int i = 0; i <= n; i++){
        for(int j = 1; j <= i; j++){
            count++;
        }
    }
    return count;
}
I hope all three functions are crystal clear. There is nothing wrong with any of them as long as you get the right answer for the given problem; all three are correct. But understand that the first solution is much better. You might disagree, but it is true. Please note that we are talking about large values of n here, not small ones. I will prove it to you.
Some Considerable Factors
Before we discuss which algorithm is better and why, there are a few factors that should be considered, because the speed of a program in practice depends on them too: things such as the speed of the machine, the compiler and its optimizations, the programming language, and the load on the system at the time of execution.
In simple terms, even if you write the best algorithm, it might take more or less time because of the factors mentioned above. And you do not need to implement and run every algorithm just to figure out which one is efficient. For that, we have 'Asymptotic Analysis'.
Asymptotic Analysis
Now, what on earth is this? Basically, it measures the efficiency of an algorithm as the input grows, and its advantage is that it does so independently of the machine, compiler, and other factors mentioned above.
Now, for any given problem, the basic mathematical operations take constant time. For instance, above we had sum = sum + i, a constant-time operation, count++, and then some initializations. All of these are constants, and individually they do not impact us much.
So, looking at Solution 1: it takes constant time because, for any input however large, it is a one-line math formula, a constant number of operations. Whether n is a million or more does not really matter.
Sol1: C1 (Constant time)
Solution 2: Most of the operations are constants, but the for loop runs n times, so the loop body contributes work proportional to n, written C2 * n. Along with that, we have some constant operations, represented as C3.
Sol2: C2 * n + C3
Solution 3: This also works, but if you notice, the number of operations is quite large. The double for loop performs about n * n operations, C4 * n^2; the outer loop alone runs n times, C5 * n; and some constant work is done, C6.
Sol3: C4 * n^2 + C5 * n + C6
There you go: just by looking at a solution, you get an idea of which one takes more effort, and this is nothing but the 'Order of Growth' of an algorithm. I will explore that part more next time. These constants can also be dealt with in a better way; I will illustrate that in the next article.