
[Answers] Is it enough to show the number of steps for certain values of $n$ in order to state an algorithm's complexity?

Problem Detail: 

If I can easily state the number of steps an algorithm takes for certain values of $n$ (e.g. for $n = 2^k$, where $k$ is a whole number, the number of steps is $n\log n$), is this enough to let me state that the complexity is amortized $O(n\log n)$?

I have a tree-building algorithm: when adding elements, a rebalancing operation occurs every so often, with those operations getting progressively more expensive as the tree size approaches a power of two elements (before becoming cheaper again once we pass such a power). At every power of two elements it's easy to state the number of steps done so far, but in between it's not so easy.
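To make that cost pattern concrete, here is a toy model in Python (my own simplified sketch, not the actual tree code): every insertion costs one step, and a rebalance costing $\mathrm{size} \cdot \log_2(\mathrm{size})$ steps fires whenever the size reaches a power of two, so the running total is simple to state exactly at $n = 2^k$ and messier in between.

```python
import math

def total_steps(n):
    """Toy cost model: 1 step per insertion, plus a rebalance costing
    size * log2(size) steps whenever size reaches a power of two."""
    steps = 0
    for size in range(1, n + 1):
        steps += 1                                   # the insertion itself
        if size > 1 and size & (size - 1) == 0:      # size is a power of two
            steps += size * int(math.log2(size))     # expensive rebalance
    return steps

# At n = 2**k the total is n + sum(j * 2**j for j <= k) = O(n log n);
# between powers of two the closed form is less obvious.
for k in range(1, 6):
    n = 2 ** k
    print(n, total_steps(n))
```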

I'm slightly embarrassed by this question, as I currently assume it is OK to state an amortized time of $O(n\log n)$ for this situation, but my conclusions from reading a discussion elsewhere (see the update below) seemed to contradict this.


Update: I've moved the second part of this question, relating to Conc-tree lists and my confusion over how such trees could be built in $O(n)$, into a separate question.

Asked By : George Hawkins

Answered By : Yuval Filmus

You are asking several questions. I will only answer some of them.

Suppose that the running time of an algorithm as a function of $n$ satisfies $T(2^k) = O(2^k \log 2^k)$. Is it true that $T(n) = O(n \log n)$? Without any more information on the running time, this is patently false. For example, consider the function $$ T(n) = \begin{cases} n\log n & \text{if $n$ is a power of $2$}, \\ n^2 & \text{otherwise}. \end{cases} $$

However, in many cases we would expect $T$ to be monotone in $n$. In that case, the deduction does hold. Suppose that $T(n) \leq Cn\log n$ whenever $n$ is a power of $2$. Given arbitrary $n$, we can find a power of $2$, say $m$, with $n \leq m \leq 2n$ (the smallest power of $2$ that is at least $n$), and then monotonicity gives $$ T(n) \leq T(m) \leq Cm\log m \leq C(2n)\log(2n) = 2Cn \log n + O(n) = O(n\log n). $$
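As a quick numerical sanity check of the counterexample (a sketch of my own, in Python): the ratio $T(n)/(n\log n)$ equals $1$ along powers of two but grows without bound just off them, so $T(n) \neq O(n\log n)$.

```python
import math

def T(n):
    """The counterexample: nicely bounded exactly at powers of two."""
    if n & (n - 1) == 0:        # n is a power of two
        return n * math.log2(n)
    return n ** 2

for n in [2**10, 2**10 + 1, 2**20, 2**20 + 1]:
    # A bounded ratio would mean T(n) = O(n log n); off the powers of
    # two this ratio is n / log2(n), which is unbounded.
    print(n, T(n) / (n * math.log2(n)))
```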

Suppose now that there is a data structure such that the first $n$ operations take time at most $Cn$ whenever $n$ is a power of $2$. If we let $T(n)$ denote the maximal time that $n$ operations take, then $T$ satisfies the conditions stated above (in particular, monotonicity is built-in), and so the amortized time per operation is indeed $O(1)$. Your case is a bit more complicated since different operations have different amortized running times, but the conclusion should be the same.
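A standard concrete instance of this situation is the classic doubling array (used here as an assumed stand-in for the asker's tree): the first $n$ appends cost at most $3n$ steps in total, so each append is amortized $O(1)$.

```python
def append_costs(n):
    """Doubling-array cost model: each append writes one slot; when the
    array is full, every existing element is copied to a larger array."""
    capacity, size, total = 1, 0, 0
    for _ in range(n):
        if size == capacity:     # full: double capacity and copy
            total += size        # copying all current elements
            capacity *= 2
        total += 1               # the append itself
        size += 1
    return total

for k in range(1, 8):
    n = 2 ** k
    print(n, append_costs(n), append_costs(n) / n)  # ratio stays below 3
```

Since $T(n)$ here denotes the maximal total cost of the first $n$ operations, it is monotone by definition, and the power-of-two bound extends to all $n$ exactly as above.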


Question Source : http://cs.stackexchange.com/questions/65548
