
[Solved]: If an algorithm runs in $\Theta(n)$ time T, what effect does doubling the input size have on T?

Problem Detail: 

In other words, is there a relationship between the step count and the actual running time? Suppose that the algorithm is run on the same machine in both cases.

Asked By : Beached Whale

Answered By : David Richerby

There's no clear relationship, except that bigger inputs will tend to take longer. What we call the "running time" in complexity theory is the number of basic operations performed, but we just choose some set of basic operations and assume that they all have the same cost. That's fine for asymptotic analysis, but it doesn't correspond to the actual running time as measured by a clock.

For example, imagine your algorithm consists of an initialization phase that takes one second, regardless of the input, plus a quadratic-time loop that takes $n^2$ milliseconds to process $n$ items. Complexity theory just says that your algorithm has running time $\Theta(n^2)$, so doubling the size of the input multiplies the running time by four. Asymptotically, this is true but, if you double the input size from $10$ to $20$, the running time only changes from $1.1\,\mathrm{s}$ to $1.4\,\mathrm{s}$, which is nothing like a factor of four (it's an increase of less than $30\%$).
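As a small illustration, here is a sketch (in Python) of the hypothetical cost model from the example above: a fixed one-second setup plus $n^2$ milliseconds of loop time. The numbers are the example's assumptions, not measurements of a real program; the point is that the observed ratio only approaches the asymptotic factor of four as $n$ grows.

```python
def running_time_ms(n):
    """Hypothetical wall-clock time: constant setup plus a quadratic loop."""
    init_ms = 1000      # fixed initialization cost (1 second), independent of n
    loop_ms = n * n     # quadratic phase: n^2 milliseconds for n items
    return init_ms + loop_ms

for n in (10, 100, 1000):
    t1 = running_time_ms(n)
    t2 = running_time_ms(2 * n)
    print(f"n={n:5d}: {t1/1000:9.3f} s   n={2*n:5d}: {t2/1000:9.3f} s   ratio {t2/t1:.2f}")
```

Running it shows the ratio rising from about $1.27$ at $n=10$ to about $3.73$ at $n=100$ and essentially $4$ at $n=1000$, once the quadratic term dominates the fixed overhead.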

Another problem is that complexity theory doesn't model the sheer... well, complexity of modern computers. A simple example: suppose your quadratic algorithm runs in half a second on an input of length $\ell$. Great, it's going to run in just over half a second on a similar input of length $\ell+1$, right? Maybe. Or maybe it takes ten seconds because your data doesn't fit in the cache any more.

Best Answer from Computer Science Stack Exchange

Question Source : http://cs.stackexchange.com/questions/35392
