To share or not to share.... lol.
Ok, since it's time for final exams for many programming students, I'll be sharing some techniques for determining the Big-O notation of an algorithm, which you can add to your list if they're not there yet. It's also helpful for optimizing solutions for those who have programming gigs (and a stubborn persistence not to deliver defective products).
Anyway, if you don't need a refresher for Big-O and you already know what it is, scroll down and skip to the DESCRIPTION part.
INTRO
In Computer Science, Big-O notation is used to classify algorithms based on their complexity, measured in time and space. It was introduced by the German mathematician Paul Bachmann and popularized by Edmund Landau. It describes the order of growth, or growth rate, T(n) of a function or algorithm as n approaches infinity.
It's like putting a piece of equipment under a stress test and quantifying its efficiency as you increase the level of stress. You can think of it as describing how the time required to deliver mail from one city to another grows as the number of letters increases. The slower the growth rate, the faster the algorithm; conversely, the faster the growth rate, the slower the algorithm. The difference becomes more visible as the problem size (n) gets larger.
For starters, you should be familiar with the running times of the most commonly used complexity classes, listed below from slowest to fastest growth:
1. O (1) - Constant
Example: push or pop an item at the head of a linked list
2. O (log n) - Logarithmic
Example: binary search
3. O (n) - Linear
Example: linear search
4. O (n log n) - Linearithmic / Loglinear / Quasilinear
Example: merge sort
5. O (n^2) - Quadratic
Example: bubble sort
6. O (2^n) - Exponential
Example: Tower of Hanoi
7. O (n!) - Factorial
Example: enumerate all permutations of a set
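To make a couple of these classes concrete, here's a minimal Java sketch (the class and method names are my own, purely for illustration) contrasting O (n) linear search with O (log n) binary search:

```java
public class SearchDemo {
    // Linear search: scans every element, so the worst case
    // makes n comparisons -> O (n).
    static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == key) return i;
        }
        return -1;
    }

    // Binary search: halves the remaining range on every pass,
    // so the worst case makes about log2(n) comparisons -> O (log n).
    // Precondition: the array must be sorted.
    static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (a[mid] == key) return mid;
            else if (a[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {2, 4, 6, 8, 10};
        System.out.println(linearSearch(a, 8)); // prints 3
        System.out.println(binarySearch(a, 8)); // prints 3
    }
}
```

The payoff shows at scale: on a sorted array of a million elements, binary search needs roughly 20 comparisons in the worst case, while linear search may need a million.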
DESCRIPTION
There are many techniques for determining the complexity of an algorithm, and you may need to combine several of them as more scenarios come up. But here are the two main things you should watch out for:
1. Type of Statements
Example: loops, conditionals, function calls, etc.
2. Sequence of Statements
Example: arithmetic operations on n
And here's why: the type of a statement can determine a growth rate, and the sequence of statements in relation to the statement types (blocks) can also determine a growth rate.
As an example, can you tell if there is a difference between the two loops described below?
PROCEDURE A:
for (int i = 0; i < 25; i++) {
    System.out.print(" * ");
}

PROCEDURE B:
for (int i = 21; i >= 3; i -= 3) {
    System.out.println(" * ");
}
The answer is no. PROCEDURE A contains a loop and iterates 25 (n) times, and the statements inside its loop do not affect the number of iterations it performs. In the same way, PROCEDURE B contains a loop and iterates 7 (n) times, and the statements inside its loop do not affect the number of iterations it performs. So far, the growth rate of PROCEDURE A is the same as that of PROCEDURE B: O (n).
Now, if we add or change a statement inside the loop of PROCEDURE A, and it changes the total number of iterations that PROCEDURE A performs, then there will be a difference, and the growth rate may change depending on how the new iteration count relates to n.
For example, if we add a statement to PROCEDURE A that increases the value of i by 4 at each iteration, the counter advances by 5 per pass and the total number of iterations drops from 25 to 5. Be careful here: this is still O (n), because n/5 iterations is a linear function of n and constant factors are dropped. To actually change the growth rate from O (n) to O (log n), the counter must be multiplied rather than incremented: a loop that starts i at 1 and multiplies it by 5 each pass reaches 25 in only log_5 (25) = 2 iterations.
CHECK:
log_5 (25) = 2
5^2 = 25
NEW PROCEDURE A (total iterations: 5):
for (int i = 0; i < 25; i++) {
    i += 4;
    System.out.print(" * ");
}

ORIGINAL PROCEDURE A (total iterations: 25):
for (int i = 0; i < 25; i++) {
    System.out.print(" * ");
}

Both versions are still O (n); only the constant factor changed.
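To see the difference between an additive step and a multiplicative step, here's a small Java sketch (class and method names are my own) that counts the iterations each kind of loop performs:

```java
public class LoopGrowth {
    // Additive step: i advances by a constant 5 each pass,
    // so the loop runs n/5 times -> still O (n).
    static int additiveCount(int n) {
        int count = 0;
        for (int i = 0; i < n; i += 5) count++;
        return count;
    }

    // Multiplicative step: i is multiplied by 5 each pass,
    // so the loop runs about log_5(n) times -> O (log n).
    static int multiplicativeCount(int n) {
        int count = 0;
        for (int i = 1; i < n; i *= 5) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(additiveCount(25));       // prints 5
        System.out.println(multiplicativeCount(25)); // prints 2 (= log_5 25)
    }
}
```

Doubling n doubles additiveCount's iterations, but multiplicativeCount barely budges: going from n = 25 to n = 125 only adds one iteration.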
Summary of Steps:
1. Define the type of statements used.
2. Drop constants and all other components that do not affect the number of operations required in relation to processing n.
3. Calculate the growth rate at every checkpoint, if there are any:
- Sequence of statements
- Conditional statements
- Loops
- Function calls
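As a quick walkthrough of the steps above, here's a hypothetical Java sketch (names are my own): a nested loop where each level iterates n times, giving n * n = n^2 constant-time operations, which is O (n^2) once constants are dropped:

```java
public class NestedLoops {
    // Step 1: the statements are two nested loops.
    // Step 2: the inner body is a single constant-time operation.
    // Step 3: outer loop runs n times, inner loop runs n times per
    //         outer pass -> n * n = n^2 operations -> O (n^2).
    static long countOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                ops++; // one constant-time operation per inner iteration
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(10)); // prints 100 = 10^2
    }
}
```

If the inner loop instead ran a fixed number of times (say, always 5), the count would be 5n, and dropping the constant would bring it back down to O (n).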
Some great references:
- http://web.mit.edu/16.070/www/lecture/big_o.pdf
- http://cooervo.github.io/Algorithms-DataStructures-BigONotation/index.html
- http://bigocheatsheet.com/
Oh, look at that. Time flies quick. And I still have a few modules to review so... later! =D
Best wishes to everyone doing finals! ^^,)