
What's Big O of $\log(n+7)^{\log(n)}$?

Problem Detail: 

As part of my continual improvement plans, I'm reviewing algorithm analysis and I've hit a roadblock. I cannot figure out how to determine the Big O complexity of $\log(n+7)^{\log(n)}$. I've spent the last few days rolling this over in my head and I still haven't made any progress in solving it. Could someone tell me what the Big O of this algorithm is, and explain how you solved it?

Asked By : Bobby Vandiver

Answered By : G. Bach

If you meant what $\mathcal{O}((\log (n+7))^{\log (n)})$ is (so the power applies to the logarithm itself, not to its argument), then we can simplify that to $\mathcal{O}((\log (n))^{\log (n)})$; following a suggestion by Saeed Amiri, here is a proof of this equality:

Since $(\log (n))^{\log (n)} \leq (\log (n+7))^{\log (n)}$ for all $n$, we have $\mathcal{O}((\log (n))^{\log (n)}) \subseteq \mathcal{O}((\log (n+7))^{\log (n)})$.
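This pointwise bound is easy to spot-check numerically; here is a small Python sketch (the helper names `f` and `g` are ours, not from the answer):

```python
import math

def f(n):
    # (log n)^(log n)
    return math.log(n) ** math.log(n)

def g(n):
    # (log(n+7))^(log n)
    return math.log(n + 7) ** math.log(n)

# For every sampled n >= 2 the claimed pointwise bound f(n) <= g(n) holds,
# which gives f in O(g) with constant c = 1.
for n in [2, 10, 100, 10**4, 10**8]:
    assert f(n) <= g(n)
```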

For the other direction, consider

$\qquad \begin{align} (\log (2n))^{\log (n)} &\leq c(\log (n))^{\log(n)} &\iff \\ \frac{\log(2n)}{\log(n)} &\leq \sqrt[\log(n)]{c} &\iff \\ 1+\frac{\log(2)}{\log(n)} &\leq \sqrt[\log(n)]{c} &\iff \\ \left(1+\frac{\log(2)}{\log(n)}\right)^{\log(n)} &\leq c \end{align} $

What we now notice is that $\log n$ is in general not an integer, so we cannot directly apply the binomial expansion here. We could if we knew that

$\left(1+\frac{\log(2)}{\log(n)}\right)^{\log(n)} \leq \left(1+\frac{\log(2)}{\lceil\log(n)\rceil}\right)^{\lceil\log(n)\rceil}$

That inequality holds if the function is monotonically nondecreasing, so here is a small proof of that:

$\frac{d}{dx}\left(1+\frac{\log(2)}{\log(x)}\right)^{\log(x)} = \frac{\left(\frac{\log (2x)}{\log (x)}\right)^{\log (x)}\left(\log (2x)\log\left(\frac{\log (2x)}{\log (x)}\right) - \log (2)\right)}{x \log (2x)} \geq \frac{\log (2x)\log\left(\frac{\log (2x)}{\log (x)}\right) - \log (2)}{x (\log (x))^{\log (x)}} > 0$

for all $x > x_0$ for some $x_0$ [specifically, for $x_0 = 2$; check the link to math in the comment section].
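Rather than wading through the derivative, the monotonicity can also be checked numerically; a minimal Python sketch (the helper name `h` is ours):

```python
import math

def h(x):
    # h(x) = (1 + log(2)/log(x))^(log(x)); claimed nondecreasing for x > 2
    return (1 + math.log(2) / math.log(x)) ** math.log(x)

samples = [3, 10, 100, 10**4, 10**6]
values = [h(x) for x in samples]

# Nondecreasing on the sampled points, and still below the limit e^{log 2} = 2.
assert all(a <= b for a, b in zip(values, values[1:]))
assert values[-1] < 2
```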

Since we now know that the function is monotonically nondecreasing for all $n > 2$, we can choose our $c$ slightly differently:

\begin{align} \left(1+\frac{\log(2)}{\log(n)}\right)^{\log(n)} \leq \left(1+\frac{\log(2)}{\lceil\log(n)\rceil}\right)^{\lceil\log(n)\rceil} &\leq c &\overset{\text{rewriting the polynomial}}{\iff} \\ \sum_{i=0}^{\lceil\log(n)\rceil}\binom{\lceil\log(n)\rceil}{i}\left(\frac{\log(2)}{\lceil\log(n)\rceil}\right)^i &\leq c \end{align}

Now we use the inequality $\binom{n}{k} \leq \frac{n^k}{k!}$ and see that \begin{align} \sum_{i=0}^{\lceil\log(n)\rceil}\binom{\lceil\log(n)\rceil}{i}\left(\frac{\log(2)}{\lceil\log(n)\rceil}\right)^i &\leq \sum_{i=0}^{\lceil\log(n)\rceil}\frac{(\lceil\log(n)\rceil)^i}{i!}\left(\frac{\log(2)}{\lceil\log(n)\rceil}\right)^i \\ &= \sum_{i=0}^{\lceil\log(n)\rceil}\frac{(\log(2))^i}{i!} \\ &\leq \sum_{i=0}^{\infty}\frac{(\log(2))^i}{i!} \\ &= e^{\log(2)} \end{align}
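The bound $e^{\log(2)} = 2$ on the binomial sum can be verified directly; a quick Python sketch (the helper name `binomial_sum` is ours):

```python
import math

LOG2 = math.log(2)

def binomial_sum(n):
    # sum_{i=0}^{m} C(m, i) * (log(2)/m)^i  with  m = ceil(log n);
    # this equals (1 + log(2)/m)^m, which increases toward e^{log 2} = 2.
    m = math.ceil(math.log(n))
    return sum(math.comb(m, i) * (LOG2 / m) ** i for i in range(m + 1))

for n in [3, 50, 10**3, 10**9]:
    assert binomial_sum(n) <= 2.0
```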

So we know that $\mathcal{O}((\log (n+7))^{\log (n)}) \subseteq \mathcal{O}((\log (n))^{\log (n)})$, since we can choose a finite $c$ in the above inequality.


If we try to express $(\log (n))^{\log (n)}$ as a power of $n$ now, we get

\begin{align} n^{f(n)} &= (\log (n))^{\log(n)} &\iff \\ \log(n^{f(n)}) &= \log ((\log (n))^{\log(n)}) &\iff \\ f(n) \cdot \log(n) &= \log(n) \cdot \log (\log (n)) &\iff \\ f(n) &= \log (\log (n)) \end{align}

and so we get $\mathcal{O}((\log (n))^{\log (n)}) = \mathcal{O}(n^{\log(\log(n))})$, if that is any improvement.
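The identity $(\log n)^{\log n} = n^{\log \log n}$ is exact (both sides equal $e^{\log n \cdot \log \log n}$), which a quick numeric check confirms:

```python
import math

# Both sides equal exp(log(n) * log(log(n))), so they agree exactly
# up to floating-point rounding; n > e so that log(log(n)) is defined.
for n in [3.0, 10.0, 1000.0, 10.0**6]:
    lhs = math.log(n) ** math.log(n)
    rhs = n ** math.log(math.log(n))
    assert math.isclose(lhs, rhs, rel_tol=1e-9)
```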

(Credit where it's due: I copied this way of expressing our function as a power of $n$ from here, so google can help with this stuff, sometimes.)


Now where does this place $\mathcal{O}((\log n)^{\log n})$ in terms of complexity? We know that $\forall c \in \mathbb{R}: \lim_{n \to \infty} \frac{c}{\log \log n} = 0$, which means that $n^c \in \mathcal{o}((\log n)^{\log n}) = \mathcal{o}(n^{\log \log n})$; the small $\mathcal{o}$ is intended here, it means that $n^c$ grows strictly slower than $n^{\log \log n}$. [A small remark about that at the end of all this; if this is not relevant to your needs or confuses you, you can - apart from those in the remark at the end itself - replace every $\mathcal{o}$ with $\mathcal{O}$, all statements will still be true.]
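The first claim, that any fixed polynomial exponent $c$ is eventually dominated by $\log \log n$, can be spot-checked; the values $c = 5$ and $n = 10^{100}$ below are arbitrary choices of ours:

```python
import math

# For any fixed c, log(log(n)) eventually exceeds c, so n^c in o(n^{log log n}).
c = 5.0
n = 10.0 ** 100
# log(log(10^100)) = log(100 * log(10)) is about 5.44, which exceeds c = 5.
assert math.log(math.log(n)) > c
```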

On the other hand, we have for any constant base $b > 1$:

\begin{align} n^{\log \log n} &\leq b^n &\iff \\ (\log \log n) \cdot (\log n) &\leq n \cdot \log b \end{align}

This tells us that instead of comparing $n^{\log \log n}$ and $b^n$, we can equivalently compare $(\log \log n) \cdot (\log n)$ and $n$, ignoring the constant (!) factor $\log b$. Using the limit rules for asymptotic notation together with l'Hôpital's rule, we get

\begin{align} \lim_{n \to \infty} \frac{(\log \log n) \cdot (\log n)}{n} &\leq \lim_{n \to \infty} \frac{(\log n) \cdot (\log n)}{n} \\ &\overset{\text{l'Hôpital}}{=} \lim_{n \to \infty} \frac{2 \log n}{n} \\ &\overset{\text{l'Hôpital}}{=} \lim_{n \to \infty} \frac{2}{n} = 0 \end{align}

Therefore we know $(\log n)^{\log n} = n^{\log \log n} \in \mathcal{o}(b^n)$ for any base $b>1$ of the exponential, meaning it grows strictly slower than any exponential function with constant base that we would be interested in. (What I mean by this is that in the analysis of the complexity of algorithms, we don't care about exponentials with negative bases because those aren't even real functions, much less functions over the naturals, or exponentials with bases between $0$ and $1$ because those all converge to $0$.)
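To see this domination numerically without floating-point overflow, one can compare logarithms instead of the functions themselves; a Python sketch (the helper name `log_ratio` is ours, with $b = 2$ as an arbitrary example base):

```python
import math

def log_ratio(n, b=2.0):
    # log( n^{log log n} / b^n ) = (log log n)*(log n) - n*log(b);
    # working in the log domain avoids overflow for large n.
    return math.log(math.log(n)) * math.log(n) - n * math.log(b)

# The log-ratio is strictly decreasing on these samples and heads to
# -infinity, consistent with n^{log log n} in o(b^n).
vals = [log_ratio(n) for n in [10, 100, 10**4, 10**6]]
assert all(x > y for x, y in zip(vals, vals[1:]))
assert vals[-1] < 0
```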

In summary, we now know that in terms of complexity:

\begin{align} \mathcal{O}(n^c) \subsetneq \mathcal{O}((\log n)^{\log n}) = \mathcal{O}(n^{\log \log n}) \subsetneq \mathcal{O}(b^n) \end{align}

for any constants $c > 0$ and $b > 1$. What you have is a function strictly between polynomial and exponential growth.


Now the remark about Big-Oh and small-oh: $\mathcal{o}$ is a stronger statement than $\mathcal{O}$, because if $f(n) \in \mathcal{o}(g(n))$ then we know $g(n)$ grows strictly (meaning by a non-constant factor) faster than $f(n)$, but if $f(n) \in \mathcal{O}(g(n))$ we only know that $g(n)$ grows at least as fast as $f(n)$, so there may only be a constant factor. For example, $n \in \mathcal{o}(n \log n)$ and $n \in \mathcal{O}(n)$, but $n \not \in \mathcal{o}(n)$.


Question Source : http://cs.stackexchange.com/questions/13912
