How would I solve the following?
An algorithm that is $O(n^2)$ takes 10 seconds to execute on a particular computer when $n = 100$. How long would you expect it to take when $n = 500$?
Can anyone help me answer this?
Asked By : Fernando Martinez
Answered By : AJed
Since it is $O(n^2)$, we have $t \leq c n^2$ for some constant $c$. Treating the bound as tight (i.e., $t \approx c n^2$), then from $t = 10$ and $n = 100$ we get $c = t / n^2 = 10 / 10000 = 10^{-3}$.
Therefore, at $n = 500$, you have $t \approx 10^{-3} \times 500^2 = 250$ seconds.
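The same scaling argument can be sketched in a few lines of Python (the helper name `scaled_time` is my own, not from the question):

```python
# Estimate the new running time of an algorithm assumed to run in
# time proportional to n**power, by scaling a known measurement by
# the ratio of input sizes raised to that power.
def scaled_time(t_old: float, n_old: int, n_new: int, power: int = 2) -> float:
    """Assumes t(n) ~ c * n**power, so t_new = t_old * (n_new / n_old)**power."""
    return t_old * (n_new / n_old) ** power

# 10 seconds at n = 100; estimate for n = 500 with quadratic growth:
print(scaled_time(10, 100, 500))  # → 250.0
```

Note that the constant $c$ cancels out: only the ratio $(500/100)^2 = 25$ matters, which is why the estimate is $10 \times 25 = 250$ seconds.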
But wait: the definition of $O$-notation only guarantees the above bound for sufficiently large $n$, and only as an upper bound; I assumed the running time actually grows quadratically. Honestly, I don't like this question. It is a bit weird.
Best Answer from StackOverflow
Question Source : http://cs.stackexchange.com/questions/11781