Could someone please explain how to get the time complexity of checking whether a number is prime? I'm confused about whether it is $O(\sqrt{n})$ or $O(n^2)$.
I iterate from $i = 2$ to $\sqrt{n}$, checking at each step that $n \bmod i \ne 0$.
However, do we also have to consider the complexity of computing the square root? If we do, Newton's method for finding square roots has a complexity of $n^2$.
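For reference, here is a minimal sketch (in Python, which is an assumption since the question names no language) of the trial-division test described above; it uses `math.isqrt` for the integer square root, so the loop runs $O(\sqrt{n})$ times:

```python
import math

def is_prime(n):
    """Trial division: test divisors 2..floor(sqrt(n)), i.e. O(sqrt(n)) divisions."""
    if n < 2:
        return False
    for i in range(2, math.isqrt(n) + 1):
        if n % i == 0:   # found a divisor, so n is composite
            return False
    return True

print(is_prime(97))   # True
print(is_prime(100))  # False
```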
Asked By : TdBm
Answered By : wythagoras
When $n$ is the input, we test $\sqrt{n}$ divisors. However, we also have to take into account the complexity of division itself. For a number with $b$ bits, this takes $O(b \log b \log\log b)$ operations when we use the Schönhage–Strassen algorithm. Since half of the numbers below $\sqrt{n}$ have about $\log_2(\sqrt{n})$ bits, we may assume they all have this many bits; this costs at most a factor $\frac12$, which is absorbed in the $O$.
This gives a computational complexity of $O\bigl(\sqrt{n}\,\log\sqrt{n}\,\log\log\sqrt{n}\,\log\log\log\sqrt{n}\bigr)$. Since $\log\sqrt{n} = \tfrac12\log n$, we can simplify this to $O\bigl(\sqrt{n}\,\log n\,\log\log n\,\log\log\log n\bigr)$.
However, when stated in terms of the number of bits $b$ of the input, which is the more standard convention, this becomes $O(2^{b/2}\, b \log b \log\log b)$, since $n \approx 2^b$ and hence $\sqrt{n} \approx 2^{b/2}$.
You also need to compute $\lfloor\sqrt{n}\rfloor$, but that has complexity only $O(b \log b \log\log b)$ when using Newton's method together with the Schönhage–Strassen algorithm.
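As an illustration of that last point, here is a sketch of an integer square root via Newton's iteration (plain Python big-integer arithmetic, not the Schönhage–Strassen based variant mentioned above); each step roughly doubles the number of correct bits, so only $O(\log b)$ iterations are needed:

```python
def isqrt_newton(n):
    """Integer square root floor(sqrt(n)) of a non-negative integer n,
    computed with Newton's iteration for f(x) = x^2 - n."""
    if n < 0:
        raise ValueError("n must be non-negative")
    if n < 2:
        return n
    # Start from a power of two >= sqrt(n) so the iteration converges from above.
    x = 1 << ((n.bit_length() + 1) // 2)
    while True:
        y = (x + n // x) // 2   # Newton step
        if y >= x:              # no further decrease: x is floor(sqrt(n))
            return x
        x = y

print(isqrt_newton(10**20))  # 10000000000
```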
However, this is not optimal. The current best algorithm, the AKS algorithm, reaches a worst-case complexity below $O(b^{7})$, which is much faster. (To be precise, even $O(b^{6+\varepsilon})$ for any $\varepsilon>0$ has been shown, but that is more complicated than it needs to be for this answer.)
Question source: http://cs.stackexchange.com/questions/47975