As a programmer with a non-CS background, I am learning algorithms.
When the performance of quicksort is explained in algorithms books and elsewhere on the web, I do not see any reference to the time/space needed for shuffling. I understand that shuffling is important to achieve the $O(n\log n)$ performance, but the shuffle itself has a complexity of $O(n)$. Why does this become irrelevant in the total running time?
Asked By : zencv
Answered By : David Richerby
Here's a little more detail than Dave Clarke's answer.
Remember that $O(n\log n)$ means that there are constants $c_1$ and $n_1$ such that, for all $n\geq n_1$, the running time of the sorting phase is at most $c_1n\log n$ steps.1 Similarly, $O(n)$ for the shuffle means that there are constants $c_2$ and $n_2$ such that, for all $n\geq n_2$, the running time of the shuffling phase is at most $c_2n$ steps. So, whenever $n\geq n_1$ and $n\geq n_2$, the total running time is at most $c_1n\log n + c_2n$. Since $\log n\geq 1$ whenever $n\geq 2$ (taking logs to base 2; for logs to some other base $b$, you need $n\geq b$), we have \begin{equation*} c_1n\log n + c_2n \le c_1n\log n + c_2n\log n = (c_1+c_2)n\log n\,, \end{equation*} for all $n\ge \max\,\{n_1, n_2, 2\}$.
We conclude that the total running time is $O(n\log n)$.
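To make the two phases concrete, here is a minimal Python sketch of the shuffle-then-sort pipeline (the function names such as `fisher_yates_shuffle` and `randomized_sort` are just illustrative): the Fisher–Yates shuffle is a single $O(n)$ pass over the array, and the recursive quicksort that follows accounts for the expected $O(n\log n)$ term, which dominates the total.

```python
import random

def fisher_yates_shuffle(a):
    """Shuffle the list in place: one pass over the array, O(n) time, O(1) extra space."""
    for i in range(len(a) - 1, 0, -1):
        j = random.randint(0, i)      # pick a random index in a[0..i], inclusive
        a[i], a[j] = a[j], a[i]       # swap

def quicksort(a, lo=0, hi=None):
    """In-place quicksort; expected O(n log n) on a shuffled input."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[hi]
    i = lo
    for k in range(lo, hi):           # Lomuto partition: one O(n) pass per level
        if a[k] < pivot:
            a[i], a[k] = a[k], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)

def randomized_sort(a):
    fisher_yates_shuffle(a)           # O(n) preprocessing
    quicksort(a)                      # expected O(n log n): dominates the total

data = [5, 3, 8, 1, 9, 2]
randomized_sort(data)
print(data)                           # [1, 2, 3, 5, 8, 9]
```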
1 Strictly speaking, this is something like the expected running time: as is well known, the worst-case running time is actually $O(n^2)$.
Question Source : http://cs.stackexchange.com/questions/23387