Ever since my college days, I have struggled with algorithms, their complexities, asymptotic notations, and so on. But after reading a lot of articles on the subject, I can finally say that I have learnt something. So I am going to post whatever I have learnt so far, both for my own future reference and for other geeks who are at the same place I was a few years back.
(This post is specifically for beginners.) You may find the pace slow, but once you finish it, I am sure you will have learnt something new.
If you read this post slowly and carefully, it will show you what this notation _really_ means.
All functions have some kind of behavior as n grows towards infinity. For example, if f(n) = 1/n, then as n grows towards infinity, f(n) gets closer and closer to zero. If f(n) = n*n, on the other hand, then f(n) grows without bound as n does.
Functions can grow at different speeds. If two functions are equal, then they obviously grow at the same speed. But wait, there's more! Two functions are also deemed to grow at the same speed if they're only separated by a constant multiple! For example, if f(n) = n*n and g(n) = 3*n*n, then f and g are deemed to grow at the same pace, because g(n) = 3*f(n), so they are only a constant multiple apart. That is, g(n) / f(n) = 3 for every n, no matter how large n grows.
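Here is a minimal Python sketch of that fact (the names f and g just mirror the functions above):

```python
# g(n) = 3*n*n is a constant multiple of f(n) = n*n:
# the ratio g(n) / f(n) is exactly 3 at every n, no matter how large.
for n in [10, 100, 1000]:
    print(n, (3 * n * n) / (n * n))   # prints 3.0 each time
```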
But consider the scenario where f(n) = n * n, and g(n) = n. Then what is the behavior of f(n) and g(n) as n grows arbitrarily large? Well, f(n) / g(n) = (n * n) / n, which simplifies to f(n) / g(n) = n. This means that as n grows large, f(n) = n * g(n). What does this mean? In this case f and g are not separated by a constant multiple: the multiple between f and g grows larger and larger as n grows, without stopping. We say that "f grows faster than g" when this happens, or "g grows slower than f". Just try some sample values of n, say 10, 100, and 1000. First, f(10) / g(10) = 100 / 10 = 10. Then f(100) / g(100) = 10000 / 100 = 100. And f(1000) / g(1000) = 1000000 / 1000 = 1000. We can easily see that the ratio between f and g is not constant.
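If you want to see the ratio grow for yourself, here is the same calculation in a few lines of Python:

```python
# Ratio f(n) / g(n) for f(n) = n*n and g(n) = n, at the sample values above.
def f(n):
    return n * n

def g(n):
    return n

for n in [10, 100, 1000]:
    print(n, f(n) / g(n))
# Prints the ratios 10.0, 100.0, 1000.0 -- the ratio keeps growing with n,
# so f grows faster than g.
```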
Now consider the scenario where f(n) = 2 * n * n + 1, and g(n) = n * n. In this case, our functions have the ratio f(n) / g(n) = (2 * n * n + 1) / (n * n). What is the value of this ratio as n grows towards infinity? The answer is 2. Let's simplify the expression:
(2 * n * n + 1) / (n * n) =
(2 * n * n) / (n * n) + 1 / (n * n) =
2 + 1 / (n * n).
So f(n) / g(n) = 2 + 1 / (n * n).
So as n grows large, the term 1 / (n * n) gets arbitrarily (or ridiculously) small. As n grows large, then, the value of 2 + 1 / (n * n) gets closer and closer to 2.
We could plug in some values -- let's try 10, 100, and 1000 again.
f(10) / g(10) = (2 * 10 * 10 + 1) / (10 * 10) = 201 / 100 = 2.01
f(100) / g(100) = (2 * 100 * 100 + 1) / (100 * 100) = 20001 / 10000 = 2.0001
f(1000) / g(1000) = (2 * 1000 * 1000 + 1) / (1000 * 1000) = 2000001 / 1000000 = 2.000001.
So the ratio between these two functions approaches a constant value as n grows large. Hence, f and g are said to grow at the same pace.
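The same experiment in Python confirms where the ratio is heading:

```python
# Ratio f(n) / g(n) for f(n) = 2*n*n + 1 and g(n) = n*n.
def f(n):
    return 2 * n * n + 1

def g(n):
    return n * n

for n in [10, 100, 1000]:
    print(n, f(n) / g(n))
# Prints 2.01, 2.0001, 2.000001 -- the ratio settles towards the
# constant 2, so f and g grow at the same pace.
```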
In comparing the growth rates of functions, we have these rules (a small code sketch illustrating all three follows the list):
1. If f(n) / g(n) grows out of control, getting larger and larger as n gets larger, then f is said to grow faster than g.
2. If f(n) / g(n) settles towards some constant positive value, then f is said to grow at the same pace as g.
3. If f(n) / g(n) gets closer and closer to zero, then this means that its reciprocal, g(n) / f(n), is growing out of control, so g is said to grow faster than f. (Or f is said to grow slower than g.)
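Here is a rough, purely numerical sketch of those three rules. It just samples the ratio f(n) / g(n) at two large values of n and checks whether the ratio is still growing, shrinking towards zero, or settling down. The sample points and thresholds are arbitrary choices for illustration, not a proof technique:

```python
# Heuristic comparison of two growth rates by sampling the ratio f(n) / g(n).
# The sample points and the factor-of-2 thresholds are arbitrary.
def compare_growth(f, g, n1=10**4, n2=10**6):
    r1 = f(n1) / g(n1)
    r2 = f(n2) / g(n2)
    if r2 > 2 * r1:       # rule 1: the ratio is growing out of control
        return "f grows faster than g"
    if r2 < r1 / 2:       # rule 3: the ratio is heading towards zero
        return "f grows slower than g"
    return "f grows at the same pace as g"   # rule 2: the ratio settles

print(compare_growth(lambda n: n * n, lambda n: n))              # faster
print(compare_growth(lambda n: 2 * n * n + 1, lambda n: n * n))  # same pace
print(compare_growth(lambda n: n, lambda n: n * n))              # slower
```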
Now on to big O notation!
Big O notation actually refers to an entire _set_ of functions. The notation O(expression) represents the entire set of functions that grow slower than or at the same pace as expression. For example, O(n^2) represents the entire set of functions that grow slower than or at the same pace as n^2.
In other words, if g(n) = n, then since n grows slower than n^2, it follows that g lies in the set O(n^2). Likewise, if h(n) = 2 * n * n + 1, then since h grows at the same pace as n^2, it follows that h lies in the set O(n^2).
It's also true that h lies in the set O(n^3), because h grows slower than n^3. (Assuming positive leading terms, quadratic polynomials always grow slower than cubic polynomials, which always grow slower than fourth-degree polynomials, which always grow slower than fifth-degree polynomials, and so on.)
Because O(expression) represents all the functions that grow _slower_ than or at the same pace as expression, it is used to represent upper bounds.
The other notations refer to different sets of functions. For example, o(expression) represents all the functions that grow slower than expression (and not at the same speed.) f(n) = n^2 + n - 1 lies in the set O(n^2), but it doesn't lie in o(n^2). g(n) = n lies in both.
Theta(expression) represents all the functions that grow at the same rate as expression.
Omega(expression) represents all the functions that grow faster than or at the same pace as expression.
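As a sanity check, here is a small numerical sketch of what membership in O(expression) looks like. The bound of 1000 and the sample points are arbitrary choices for this illustration; the real definition only requires that _some_ constant bound exists:

```python
# Numerical (not rigorous) check that f(n) / g(n) stays below a fixed bound,
# which is what membership in O(g) amounts to. The bound and sample points
# are arbitrary illustration choices.
def looks_like_big_o(f, g, bound=1000):
    return all(f(n) / g(n) <= bound for n in [10, 10**3, 10**6])

h = lambda n: 2 * n * n + 1
print(looks_like_big_o(h, lambda n: n * n))            # True:  h is in O(n^2)
print(looks_like_big_o(h, lambda n: n ** 3))           # True:  h is in O(n^3) too
print(looks_like_big_o(lambda n: n * n, lambda n: n))  # False: n^2 is not in O(n)
```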
In computer programs, we are of course interested in how long programs take to run, and these notations give us a convenient way to talk about that. When you say an algorithm runs in O(expression) time, you're saying that the algorithm's runtime is no worse than some multiple of expression. When you use Theta, you're saying that the algorithm's runtime is some multiple of expression. And Omega says that the algorithm's runtime grows at a rate larger than or equal to expression's rate of growth.
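To make that concrete, here is a deliberately naive nested loop whose runtime grows at the same pace as n^2 (so its runtime is in Theta(n^2)). The exact timings depend entirely on your machine; only the trend matters:

```python
import time

# count_pairs does n * n units of work, so doubling n should roughly
# quadruple the running time (the exact constants are machine-dependent).
def count_pairs(n):
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1   # executed n * n times
    return count

for n in [1000, 2000, 4000]:
    start = time.perf_counter()
    count_pairs(n)
    print(n, round(time.perf_counter() - start, 3), "seconds")
```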
Please feel free to post any comments or suggestions...
Cheers!!!