I have been learning about Big O, Big Omega, and Big Theta, and reading many SO questions and answers to better understand the notations. From my understanding, Big O is an upper bound on the running time/space of an algorithm, Big Omega is a lower bound, and Big Theta is a tight bound that holds when the two coincide.
This particular answer on SO stumped me with the following statement:
For example, merge sort worst case is both ${\cal O}(n\log n)$ and $\Omega(n\log n)$ - and thus is also $\Theta(n\log n)$, but it is also ${\cal O}(n^2)$, since $n^2$ is asymptotically "bigger" than it. However, it is NOT $\Theta(n^2)$, since the algorithm is not $\Omega(n^2)$.
I thought merge sort is ${\cal O}(n\log n)$ but it seems it is also ${\cal O}(n^2)$ because $n^2$ is asymptotically bigger than it. Can someone explain this to me?
Asked By : Computernerd
Answered By : tanmoy
This is really a mathematical matter, not a question about worst-case running time. First, note that Big O does not, by itself, mean the worst-case running time of an algorithm. Big O comes from mathematics: it denotes an asymptotic upper bound on a mathematical function.
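For reference, the usual textbook definitions of these notations can be written out; this is the sense of "asymptotic upper bound" used below:

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c>0,\ n_0:\ 0 \le f(n) \le c\,g(n) \ \text{for all } n \ge n_0\\
f(n) = \Omega(g(n)) &\iff \exists\, c>0,\ n_0:\ 0 \le c\,g(n) \le f(n) \ \text{for all } n \ge n_0\\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
\end{align*}
```

Note that nothing in these definitions mentions "worst case" - they are statements about functions, and any function (best-case, worst-case, or average-case cost) can be bounded this way.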
You can see that the graph of $n^2$ lies above the graph of $n\log n$ (for $n \ge 2$), so you can say $n^2$ is an upper bound on $n\log n$: the value of $n\log n$ never exceeds $n^2$. So mathematically you can say $n\log n=O(n^2)$. Similarly, the graphs of $n^i$ for all $i\geq 2$ lie above the graph of $n\log n$, so $n\log n$ is also $O(n^3),O(n^4),O(n^5),\ldots$
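As a quick sanity check of that claim, a minimal sketch (using base-2 logarithms, which is the usual convention for analyzing merge sort; the particular sample values of $n$ are arbitrary) confirms that $n\log n$ stays below $n^2$:

```python
import math

# Compare n*log2(n) against n^2 for a few growing values of n.
# The quadratic always dominates (here the constant c = 1 and
# n0 = 2 already witness n*log n = O(n^2)).
for n in [2, 16, 256, 4096]:
    nlogn = n * math.log2(n)
    assert nlogn <= n * n  # holds for every n >= 2, since log2(n) <= n
    print(f"n={n:5d}  n*log2(n)={nlogn:12.1f}  n^2={n*n:10d}")
```

The inequality reduces to $\log_2 n \le n$, which is true for all $n \ge 1$, so the witnesses $c=1$, $n_0=2$ in the definition of Big O work.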
But one thing to remember: although $n\log n$ is $O(n^2)$, $n\log n$ is not $\Theta(n^2)$, because $n\log n$ is not $\Omega(n^2)$. Mathematically, you can say $n\log n$ is $o(n^2)$ (little-o), meaning $n^2$ grows strictly faster.
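The little-o claim can also be checked numerically: $o(n^2)$ means the ratio $\frac{n\log n}{n^2} = \frac{\log n}{n}$ tends to 0 as $n$ grows. A minimal sketch (again with arbitrary sample values of $n$):

```python
import math

# n*log2(n) / n^2 simplifies to log2(n) / n, which shrinks
# toward 0 as n grows -- exactly what n*log n = o(n^2) means.
ratios = [(n * math.log2(n)) / (n * n) for n in [10, 100, 1000, 10**6]]
print(ratios)
assert all(a > b for a, b in zip(ratios, ratios[1:]))  # strictly decreasing
assert ratios[-1] < 1e-4                               # already tiny at n = 10^6
```

Because the ratio vanishes rather than staying bounded below by a positive constant, no constant $c > 0$ can make $n\log n \ge c\,n^2$ hold for all large $n$, which is why $n\log n$ is not $\Omega(n^2)$.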
If you still have any confusion regarding my answer, you can ask me here.
Question Source : http://cs.stackexchange.com/questions/22662