In a standard algorithms course we are taught that quicksort is $O(n \log n)$ on average and $O(n^2)$ in the worst case. At the same time, other sorting algorithms are studied which are $O(n \log n)$ in the worst case (like mergesort and heapsort), and even linear time in the best case (like bubblesort), though sometimes with additional memory requirements.
After a quick glance at some more running times, it is natural to conclude that quicksort should not be as efficient as the others.
Also, consider that students learn in basic programming courses that recursion is not really good in general, because it can use too much memory, and so on. Therefore (even though this is not a real argument), one gets the impression that quicksort might not be that good because it is a recursive algorithm.
Why, then, does quicksort outperform other sorting algorithms in practice? Does it have to do with the structure of real-world data? Does it have to do with the way memory works in computers? I know that some kinds of memory are much faster than others, but I don't know whether that is the real reason for this counter-intuitive performance (compared with the theoretical estimates).
Update 1: A canonical answer says that the constants involved in the $O(n\log n)$ of the average case are smaller than the constants involved in other $O(n\log n)$ algorithms. However, I have yet to see a proper justification of this, with precise calculations instead of intuitive ideas only.
In any case, it seems that the real difference occurs, as some answers suggest, at the memory level, where implementations take advantage of the internal structure of computers, exploiting, for example, the fact that cache memory is faster than RAM. The discussion is already interesting, but I'd still like to see more detail with respect to memory management, since it appears that the answer has to do with it.
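To make the memory point concrete, here is a rough experiment (my own sketch in Python; the absolute timings are machine- and interpreter-dependent, only the ratio is of interest) that traverses the same array once sequentially and once in a random order. Much of the gap between the two comes from cache locality:

```python
# Rough illustration of cache/locality effects: summing the same data
# sequentially (cache-friendly) vs. in a shuffled order (cache-hostile).
import random
import time

n = 2_000_000
data = list(range(n))
order = list(range(n))
random.shuffle(order)

t0 = time.perf_counter()
s = 0
for i in range(n):      # sequential access: consecutive memory locations
    s += data[i]
t1 = time.perf_counter()
for i in order:         # random access: frequent cache misses
    s += data[i]
t2 = time.perf_counter()

print(f"sequential: {t1 - t0:.2f}s   random: {t2 - t1:.2f}s")
```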
Update 2: There are several web pages offering comparisons of sorting algorithms, some fancier than others (most notably sorting-algorithms.com). Other than presenting a nice visual aid, however, this approach does not answer my question.
Asked by: Janoma
Question source: http://cs.stackexchange.com/questions/3
Best answer, by Sebastian:
Short Answer
The cache efficiency argument has already been explained in detail. In addition, there is an intrinsic argument for why Quicksort is fast: if it is implemented with two "crossing pointers" (as in the classic Hoare/Sedgewick partitioning schemes), the inner loops have a very small body. As this is the code executed most often, this pays off.
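To see what "a very small body" means, here is a minimal sketch of such a Quicksort (my own Python illustration, not the implementation the original answer linked to). Each inner loop does one comparison and one pointer move per element, and that is nearly all the work of the sort:

```python
# Quicksort with Hoare-style "crossing pointers" partitioning.
def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:
        # The two tight inner loops: each iteration is just one
        # comparison and one increment/decrement.
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    quicksort(a, lo, j)   # left part: a[lo..j]
    quicksort(a, i, hi)   # right part: a[i..hi]
```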
Long Answer
First of all,
The Average Case does not exist!
As the best and worst cases are extremes that rarely occur in practice, average-case analysis is done instead. But any average-case analysis assumes some distribution of inputs! For sorting, the typical choice is the random permutation model (tacitly assumed on Wikipedia).
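To see how much the assumed distribution matters, consider this small experiment (my sketch, not part of the original answer): a naive Quicksort that always picks the first element as pivot is cheap on random permutations but quadratic on already-sorted inputs of the same size, so its "average" cost depends entirely on which inputs you average over:

```python
# Counting comparisons of a naive first-element-pivot Quicksort on two
# input distributions: random permutations vs. already-sorted arrays.
import random
import sys

sys.setrecursionlimit(10_000)   # sorted input drives the recursion n deep

def comparisons(a):
    # One partition pass compares every remaining element to the pivot
    # once, i.e. len(a) - 1 comparisons, then recurses on both sides.
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    smaller = [x for x in rest if x < pivot]
    larger = [x for x in rest if x >= pivot]
    return len(rest) + comparisons(smaller) + comparisons(larger)

n = 1000
print(comparisons(random.sample(range(n), n)))  # roughly 2 n ln n ~ 13,800
print(comparisons(list(range(n))))              # exactly n(n-1)/2 = 499,500
```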
Why $O$-Notation?
Discarding constants in the analysis of algorithms is done for one main reason: if I am interested in exact running times, I need the (relative) costs of all involved basic operations (and that is still ignoring caching issues, pipelining in modern processors, and so on). Mathematical analysis can count how often each instruction is executed, but the running times of single instructions depend on processor details, e.g. whether a 32-bit integer multiplication takes as much time as an addition.
There are two ways out:
Fix some machine model.
This is done in Don Knuth's book series "The Art of Computer Programming" for an artificial "typical" computer invented by the author. In Volume 3 you find exact average-case results for many sorting algorithms, e.g.
- Quicksort: $ 11.667(n+1)\ln(n)-1.74n-18.74 $
- Mergesort: $ 12.5 n \ln(n) $
- Heapsort: $ 16 n \ln(n) +0.01n $
- Insertionsort: $2.25n^2+7.75n-3\ln(n)$
These results indicate that Quicksort is fastest. But this is proved only on Knuth's artificial machine; it does not necessarily imply anything for, say, your x86 PC. Note also that the algorithms rank differently for small inputs:
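A quick numerical check (my own, simply plugging concrete sizes into the formulas quoted above) makes both points visible: Quicksort wins for large $n$, while for very small $n$ Insertionsort overtakes Mergesort and Heapsort:

```python
# Evaluating Knuth's MIX cost formulas (quoted above) at concrete sizes.
from math import log

costs = {
    "Quicksort":     lambda n: 11.667 * (n + 1) * log(n) - 1.74 * n - 18.74,
    "Mergesort":     lambda n: 12.5 * n * log(n),
    "Heapsort":      lambda n: 16 * n * log(n) + 0.01 * n,
    "Insertionsort": lambda n: 2.25 * n * n + 7.75 * n - 3 * log(n),
}

for n in (8, 100, 10_000, 1_000_000):
    row = "  ".join(f"{name}={f(n):.4g}" for name, f in costs.items())
    print(f"n={n}: {row}")
```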
Analyse abstract basic operations.
For comparison-based sorting, these are typically swaps and key comparisons. This approach is pursued in Robert Sedgewick's books, e.g. "Algorithms". There you find
- Quicksort: $2n\ln(n)$ comparisons and $\frac13n\ln(n)$ swaps on average
- Mergesort: $1.44n\ln(n)$ comparisons, but up to $8.66n\ln(n)$ array accesses (mergesort is not swap-based, so we cannot count swaps for it).
- Insertionsort: $\frac14n^2$ comparisons and $\frac14n^2$ swaps on average.
As you can see, this does not readily allow comparing algorithms the way exact runtime analysis does, but the results are independent of machine details.
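These abstract counts are also easy to check empirically. Here is an instrumented Python port of crossing-pointer partitioning in the style of Sedgewick's "Algorithms" (my sketch; expect the measured numbers to agree with the formulas above only up to lower-order terms):

```python
# Count key comparisons and exchanges of Quicksort on a random permutation
# and compare them with Sedgewick's average-case figures.
import random
from math import log

comparisons = 0
exchanges = 0

def less(x, y):
    global comparisons
    comparisons += 1
    return x < y

def exch(a, i, j):
    global exchanges
    exchanges += 1
    a[i], a[j] = a[j], a[i]

def partition(a, lo, hi):
    v = a[lo]                        # first element as pivot
    i, j = lo, hi + 1
    while True:
        i += 1
        while less(a[i], v):         # scan right for an element >= pivot
            if i == hi:
                break
            i += 1
        j -= 1
        while less(v, a[j]):         # scan left for an element <= pivot
            if j == lo:
                break
            j -= 1
        if i >= j:                   # pointers crossed: partition done
            break
        exch(a, i, j)
    exch(a, lo, j)                   # put the pivot into its final place
    return j

def quicksort(a, lo, hi):
    if hi <= lo:
        return
    j = partition(a, lo, hi)
    quicksort(a, lo, j - 1)
    quicksort(a, j + 1, hi)

n = 100_000
a = random.sample(range(n), n)
quicksort(a, 0, n - 1)
assert a == sorted(a)
print(f"comparisons: {comparisons:,}  vs. 2 n ln n     = {2 * n * log(n):,.0f}")
print(f"exchanges:   {exchanges:,}  vs. (1/3) n ln n = {n * log(n) / 3:,.0f}")
```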
Other input distributions
As noted above, average cases are always defined relative to some input distribution, so one might consider distributions other than random permutations. For example, research has been done on Quicksort with equal elements, and there is a nice article on the standard sort function in Java.
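For the many-equal-keys case, the standard remedy studied in that line of work is three-way partitioning, which puts all keys equal to the pivot into their final position at once, so they never enter another recursive call. A minimal sketch (Dijkstra-style three-way partitioning, my own Python illustration):

```python
# Quicksort with three-way ("Dutch national flag") partitioning:
# efficient when the input contains many equal keys.
import random

def quicksort3(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi <= lo:
        return
    pivot = a[lo]
    lt, i, gt = lo, lo + 1, hi
    while i <= gt:
        if a[i] < pivot:
            a[lt], a[i] = a[i], a[lt]
            lt += 1
            i += 1
        elif a[i] > pivot:
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
        else:
            i += 1
    # Now a[lo..lt-1] < pivot, a[lt..gt] == pivot, a[gt+1..hi] > pivot;
    # the equal block is already in place and is not recursed on.
    quicksort3(a, lo, lt - 1)
    quicksort3(a, gt + 1, hi)

a = [random.choice("ABC") for _ in range(100_000)]   # many equal keys
quicksort3(a)
assert a == sorted(a)
```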