
# [Answers] Asymptotic Runtime of Interrelated Functions

Problem Detail:

I have two functions $S$ and $T$ which are interrelated and I want to find the asymptotic worst case runtime. The fact that they are interrelated is stumping me...

How would I find the asymptotic runtime $S(n)$ and $T(n)$?

\begin{align*} S(n) &= 2S(n/4) + T(n/4) \\ T(n) &= S(n/2) + 2T(n/2) \end{align*}

#### Answered By : Yuval Filmus

You can expand the definition of $T(n)$ as follows:
\begin{align*} T(n) &= S(n/2) + 2T(n/2) \\ &= S(n/2) + 2S(n/4) + 4T(n/4) \\ &= S(n/2) + 2S(n/4) + 4S(n/8) + 8T(n/8) \\ &= \dots \\ &= S(n/2) + 2S(n/4) + 4S(n/8) + \dots. \end{align*}
Substituting this into the definition of $S(n)$, we get
$$S(n) = 2S(n/4) + S(n/8) + 2S(n/16) + 4S(n/32) + \dots.$$
Let's estimate the value of $S(n)$. Guessing $S(n) = n^\alpha$ and dividing out the common factor $n^\alpha$, we get
\begin{align*} 1 &= \frac{2}{4^\alpha} + \frac{1}{8^\alpha} + \frac{2}{16^\alpha} + \frac{4}{32^\alpha} + \dots \\ &= \frac{2}{4^\alpha} + \frac{1}{8^\alpha} \left(1 + \frac{2}{2^\alpha} + \frac{4}{4^\alpha} + \dots\right) \\ &= \frac{2}{4^\alpha} + \frac{1}{8^\alpha} \cdot \frac{1}{1-2/2^\alpha} \\ &= \frac{2}{4^\alpha} + \frac{1}{8^\alpha-2 \cdot 4^\alpha} \\ &= \frac{2\cdot 8^\alpha - 3 \cdot 4^\alpha}{32^\alpha -2\cdot 16^\alpha} \\ &= \frac{2\cdot 2^\alpha - 3}{8^\alpha -2\cdot 4^\alpha}, \end{align*}
where summing the geometric series requires $2^\alpha > 2$, that is, $\alpha > 1$. Letting $\beta = 2^\alpha$, we get the equation $2\beta - 3 = \beta^3 - 2\beta^2$, with solutions $\beta = 1, (1\pm \sqrt{13})/2$, and so $\alpha = 0, \log_2(1+\sqrt{13})-1$. The second solution, $\alpha = \log_2(1+\sqrt{13})-1 \approx 1.20337$, is the only one satisfying $\alpha > 1$, and it is the dominant one. We conclude that $S(n) \approx n^\alpha$, at least in the sense that $\log_n S(n) \to \alpha$. (See below for a more rigorous argument.)
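As a sanity check on the algebra, one can verify numerically that $\alpha = \log_2(1+\sqrt{13})-1$ makes the series sum to $1$ and that $\beta = 2^\alpha$ is a root of the cubic. A quick sketch (summing the series term by term rather than using the closed form):

```python
import math

alpha = math.log2(1 + math.sqrt(13)) - 1   # ~ 1.20337
beta = 2 ** alpha                          # = (1 + sqrt(13)) / 2

# The series 2/4^a + 1/8^a + 2/16^a + 4/32^a + ...
# with general term 2^j / 2^((j+3)a) for j >= 0 after the first:
total = 2 / 4 ** alpha
for j in range(200):
    total += 2 ** j / 2 ** ((j + 3) * alpha)

print(alpha, total)   # total should be very close to 1
```

The tail of the series decays geometrically with ratio $2/2^\alpha \approx 0.87$, so 200 terms are more than enough for double precision.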

Since $S(n) \geq T(n/4)$ and $T(n) \geq S(n/2)$, also $\log_n T(n) \to \alpha$.
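One can also see this growth rate emerge by computing the recurrences directly. A minimal sketch, assuming a hypothetical base case $S(n) = T(n) = 1$ for small $n$ (not specified in the problem) and restricting $n$ to powers of two so the divisions stay exact:

```python
from functools import lru_cache

# Work with exponents: s(m) = S(2^m), t(m) = T(2^m).
# Base case is an assumption: S = T = 1 for n <= 2.

@lru_cache(maxsize=None)
def s(m):
    if m < 2:
        return 1
    return 2 * s(m - 2) + t(m - 2)   # S(n) = 2S(n/4) + T(n/4)

@lru_cache(maxsize=None)
def t(m):
    if m < 1:
        return 1
    return s(m - 1) + 2 * t(m - 1)   # T(n) = S(n/2) + 2T(n/2)

# Doubling n multiplies S(n) by roughly 2^alpha = (1 + sqrt(13))/2,
# so the consecutive ratio s(m+1)/s(m) should approach ~2.3028.
for m in (10, 20, 40):
    print(m, s(m + 1) / s(m))
```

The base case only affects constant factors, not the exponent, so any positive base values exhibit the same limiting ratio.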

I have claimed that the recurrence for $S(n)$ determines the asymptotic growth of $S(n)$. Here is how to prove this formally using the Akra–Bazzi theorem. We can cut the recurrence after $k+2$ terms to obtain a lower-bound recurrence: $$S_k(n) = 2S_k(n/4) + S_k(n/8) + \dots + 2^kS_k(n/2^{k+3}).$$ Clearly $S(n) \geq S_k(n)$. The Akra–Bazzi theorem states that $S_k(n) = \Theta_k(n^{p_k})$, where $p_k$ is the unique solution of $$\frac{2}{4^{p_k}} + \frac{1}{8^{p_k}} + \dots + \frac{2^k}{2^{(k+3)p_k}} = 1.$$ We conclude that $S(n) = \Omega_k(n^{p_k})$ for all $k$. As $k\to\infty$, it is not hard to check that $p_k \to \alpha$, and so $S(n) = \Omega_{\epsilon}(n^{\alpha-\epsilon})$ for all $\epsilon > 0$.
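The convergence $p_k \to \alpha$ is easy to check numerically: the left-hand side of the Akra–Bazzi equation is strictly decreasing in $p$, so simple bisection finds each $p_k$. A sketch:

```python
import math

alpha = math.log2(1 + math.sqrt(13)) - 1

def f(p, k):
    # Left-hand side of 2/4^p + 1/8^p + ... + 2^k/2^((k+3)p) = 1;
    # strictly decreasing in p.
    return 2 / 4 ** p + sum(2 ** j / 2 ** ((j + 3) * p) for j in range(k + 1))

def p_of(k):
    lo, hi = 0.0, 2.0          # f(0,k) > 1 and f(2,k) < 1 bracket the root
    for _ in range(100):
        mid = (lo + hi) / 2
        if f(mid, k) > 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for k in (1, 5, 20, 50):
    print(k, p_of(k))   # increases toward alpha ~ 1.20337
```

Each truncation drops a positive tail, so $p_k$ approaches $\alpha$ strictly from below.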

In order to obtain an upper bound, we have to work a bit harder. Start with the expression \begin{align*} T(n) &= S(n/2) + 2S(n/4) + \dots + 2^{k-1}S(n/2^k) + 2^kT(n/2^k) \\ &\leq S(n/2) + 2S(n/4) + \dots + 2^{k-1}S(n/2^k) + 2^kS(n/2^{k-2}), \end{align*} using $T(m/4) \leq S(m)$, which follows from $S(m) = 2S(m/4) + T(m/4)$. Now we can set up an upper-bound recurrence: $$S'_k(n) = 2S'_k(n/4) + S'_k(n/8) + \dots + 2^{k-1}S'_k(n/2^{k+2}) + 2^kS'_k(n/2^k).$$ Clearly $S'_k(n) \geq S(n)$. Again using the Akra–Bazzi theorem, we can estimate $S'_k(n) = \Theta_k(n^{q_k})$, where $q_k$ is the unique solution of $$\frac{2}{4^{q_k}} + \frac{1}{8^{q_k}} + \dots + \frac{2^{k-1}}{2^{(k+2)q_k}} + \frac{2^k}{2^{kq_k}} = 1.$$ Again $q_k \to \alpha$, and so for all $\epsilon > 0$, $S(n) = O_\epsilon(n^{\alpha + \epsilon})$.
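The upper-bound exponents can be checked the same way; here the correction term $2^k/2^{kq}$ vanishes as $k \to \infty$ (since $q > 1$), so $q_k$ approaches $\alpha$ from above. A sketch, mirroring the bisection used for the lower bound:

```python
import math

alpha = math.log2(1 + math.sqrt(13)) - 1

def g(q, k):
    # 2/4^q + 1/8^q + ... + 2^(k-1)/2^((k+2)q) + 2^k/2^(kq);
    # strictly decreasing in q.
    regular = 2 / 4 ** q + sum(2 ** j / 2 ** ((j + 3) * q) for j in range(k))
    return regular + 2 ** k / 2 ** (k * q)

def q_of(k):
    lo, hi = 1.0, 3.0          # g(1,k) > 1 and g(3,k) < 1 bracket the root
    for _ in range(100):
        mid = (lo + hi) / 2
        if g(mid, k) > 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for k in (5, 20, 50):
    print(k, q_of(k))   # decreases toward alpha ~ 1.20337
```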

Together, the upper and lower bounds imply $\log_n S(n) \to \alpha$.