
[Solved]: Asymptotic expected runtime of Randomized Algorithm

Problem Detail: 

I am analyzing the asymptotic runtime of a randomized algorithm in expectation. The algorithm has the following properties:

  • Given input size $n$, with probability $3/4$ it moves on to solve an instance of size $n-1$
  • With probability $1/8$ it moves on to solve an instance of size $n-2$
  • With probability $1/16$ it moves on to solve an instance of size $n-3$
  • In general, with probability $1/2^{i+1}$ it moves on to solve an instance of size $n-i$ (for $i \geq 2$)
  • Each instance pays a cost of $O(n)$, where $n$ is the input size of that instance

In expectation, the runtime can be defined recursively as follows:

$T(n) = O(n) + \sum\limits_{i=0}^{n-1} \dfrac{1}{2^{n-i+1}} T(i) + \dfrac{1}{2}T(n-1)$

$T(0) = O(1)$
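
As a sanity check, the recurrence can be evaluated numerically. The sketch below assumes the $O(n)$ cost per instance is exactly $c \cdot n$ with $c = 1$ and takes $T(0) = 1$; under these assumptions the ratio $T(n)/n^2$ levels off at a constant, which is consistent with (though of course does not prove) a $\Theta(n^2)$ bound.

```python
# Sketch: evaluate the expectation recurrence numerically.
# Assumptions: the O(n) cost per instance is exactly c*n with c = 1, and T(0) = c.

def evaluate_T(n_max, c=1.0):
    T = [c]  # T(0) = O(1), taken to be c
    for n in range(1, n_max + 1):
        jump_terms = 0.0
        weight = 0.25                   # 1/2^(n-i+1) equals 1/4 when i = n-1
        for i in range(n - 1, -1, -1):  # accumulate sum_{i<n} T(i) / 2^(n-i+1)
            jump_terms += weight * T[i]
            weight /= 2.0               # underflows harmlessly to 0.0 for small i
        T.append(c * n + jump_terms + 0.5 * T[n - 1])
    return T

if __name__ == "__main__":
    T = evaluate_T(2000)
    for n in (100, 500, 1000, 2000):
        print(n, T[n] / n ** 2)  # the ratio should approach a constant
```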

I have calculated that the expected jump length at each stage is $\leq 1$, by showing $\sum\limits_{i=0}^\infty \dfrac{i}{2^{i+1}} = 1$ using telescoping and geometric series. However, since the cost at each stage diminishes as $n$ gets smaller, this only hints that the runtime is $O(n^2)$; it does not prove it. Does anyone have ideas for proving a runtime bound for the less relaxed version?
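
For reference, one short way to get that identity (the $i = 0$ term vanishes, so the sum can start at $i = 1$) is to write $i$ as a sum of ones, swap the order of summation, and apply the geometric series twice:

$\sum\limits_{i=1}^\infty \dfrac{i}{2^{i+1}} = \sum\limits_{i=1}^\infty \sum\limits_{k=1}^{i} \dfrac{1}{2^{i+1}} = \sum\limits_{k=1}^\infty \sum\limits_{i=k}^\infty \dfrac{1}{2^{i+1}} = \sum\limits_{k=1}^\infty \dfrac{1}{2^{k}} = 1$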

EDIT: There is a slight gap in my formulation. The "$3/4$" probability for moving on to an instance of size $n-1$ should actually be slightly larger than $3/4$, since the probabilities $1/2, 1/4, 1/8, \ldots$ only go up to $1/2^{n+1}$. If no jump is made, the algorithm deterministically moves on to an instance of size $n-1$.
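
The process is also easy to simulate. Below is a minimal Monte Carlo sketch under the following assumptions: a jump of size $i$ fires with probability $1/2^{i+1}$, a step with no jump moves to an instance of size $n-1$ (as in the EDIT above), and the $\Theta(n)$ cost per instance is taken to be exactly the instance size. The estimated expected cost divided by $n^2$ should level off, in line with the recurrence.

```python
# Monte Carlo sketch of the process, under the assumptions stated above.
import random

def step_decrease(n):
    """Sample how much an instance of size n shrinks in one step.

    A jump of size i (1 <= i <= n) fires with probability 1/2^(i+1); with the
    remaining probability (about 1/2) there is no jump and the algorithm moves
    deterministically to an instance of size n-1.
    """
    u = random.random()
    if u >= 0.5:                 # total jump probability is below 1/2
        return 1                 # no jump: deterministic move to size n-1
    threshold = 0.0
    for i in range(1, n + 1):
        threshold += 0.5 ** (i + 1)
        if u < threshold:
            return i
    return 1                     # residual mass (about 1/2^(n+1)): treat as no jump

def estimate_expected_cost(n, trials=200):
    """Average total cost over independent runs, charging `size` per instance."""
    total = 0.0
    for _ in range(trials):
        size, cost = n, 0
        while size > 0:
            cost += size         # stand-in for the Theta(n) per-instance cost
            size -= step_decrease(size)
        total += cost
    return total / trials

if __name__ == "__main__":
    for n in (200, 400, 800):
        print(n, estimate_expected_cost(n) / n ** 2)  # ratio should level off
```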

Asked By : Bryce Sandlund

Answered By : Yuval Filmus

As you mention, it is easy to prove by induction that the overall runtime is $O(n^2)$. If we don't have any more information, that's (almost) all we can deduce. For example, suppose that at each instance we pay a cost of $O(1)$. The running time will then be $O(n)$. In order to make progress, we have to assume that at each instance we pay a cost of $\Theta(n)$. In that case, we can argue as follows:

  1. Since each "jump" has expected length $O(1)$, it takes $\Omega(n)$ steps (on average) to reach size $n/2$.
  2. At each of these $\Omega(n)$ steps, we pay a cost of $\Omega(n)$.
  3. Hence the total expected cost is $\Omega(n^2)$.

This is not a rigorous argument, but it can be turned into one if you're careful enough.
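
For concreteness, here is one way both directions could be fleshed out; this is a sketch under the assumption that the per-instance cost lies between $bn$ and $an$ for constants $0 < b \leq a$, and that a jump of size $i$ occurs with probability $1/2^{i+1}$ as in the question.

Upper bound: assume $T(i) \leq ci^2$ for all $i < n$. Substituting $j = n - i$ in the recurrence and using $\sum\limits_{j \geq 1} \frac{1}{2^{j+1}} = \frac{1}{2}$, $\sum\limits_{j=1}^{n} \frac{j}{2^{j+1}} \geq \frac{3}{4}$ (for $n \geq 4$) and $\sum\limits_{j \geq 1} \frac{j^2}{2^{j+1}} = 3$,

$T(n) \leq an + c\sum\limits_{j=1}^{n} \dfrac{(n-j)^2}{2^{j+1}} + \dfrac{c(n-1)^2}{2} \leq an + c\left(\dfrac{n^2}{2} - \dfrac{3n}{2} + 3\right) + \dfrac{c(n-1)^2}{2} = cn^2 + an - \dfrac{5cn}{2} + \dfrac{7c}{2}$

which is at most $cn^2$ once $c \geq a$ and $n \geq 4$ (choosing $c$ large enough also covers the finitely many base cases).

Lower bound: let $D_t$ be the decrease in instance size at step $t$ and let $\tau$ be the first step at which the size drops to $n/2$ or below. Conditioned on the history,

$\mathbb{E}[D_t \mid \text{history}] \leq 1 + \sum\limits_{i \geq 2} \dfrac{i}{2^{i+1}} = \dfrac{7}{4} < 2$

Since $\tau \leq n$, optional stopping applied to $\sum_t (D_t - 2)$ gives $\frac{n}{2} \leq \mathbb{E}\left[\sum_{t=1}^{\tau} D_t\right] \leq 2\,\mathbb{E}[\tau]$, so $\mathbb{E}[\tau] \geq \frac{n}{4}$. Each of these steps is executed on an instance of size greater than $n/2$ and therefore costs at least $bn/2$, so the expected total cost is at least $bn^2/8 = \Omega(n^2)$.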


Question Source : http://cs.stackexchange.com/questions/33813
