
[Solved]: Is Big-Oh notation preserved under monotonic functions?

Problem Detail: 

I was just looking at big-Oh notation, and I wanted to know whether the following holds in general: $$f(n)=O(g(n)) \implies \log (f(n)) = O(\log (g(n)))$$

I can prove that this is true if $g$ is monotonically increasing, but I am not sure whether it holds in general.

Asked By : Devil

Answered By : Ran G.

By definition, $$ f(n) \in O(g(n))$$ means there exists some positive constant $c$ such that, for all large enough $n$, $$ |f(n)| \le c |g(n)|,$$ or equivalently, $\limsup_{n\to\infty} \frac{|f(n)|}{|g(n)|} \le c < \infty$.

Taking $\log$ of both sides (assuming $f$ and $g$ are eventually positive), for any large enough $n$ it holds that $$ \log (f(n)) \le \log (c) + \log(g(n)) $$

so, dividing by $\log g(n)$ (assuming it is positive), $$\frac{\log f(n)}{\log g(n)} \le \frac{\log c}{\log g(n)} + 1$$

Now, can this right-hand side go to infinity? Only if $\log(g(n)) \to 0$.

So let's try: take $f(n)=2^{-n}$ and $g(n)=1/n$. Obviously $f(n) \in O(g(n))$. Yet, taking logs, we have $\log f(n) = -n$ and $\log g(n) = -\log n$, and it is clear that no constant $c'$ satisfies $|\log f(n)| \le c' |\log g(n)|$ for all large enough $n$. Still, I always find it weird to discuss $O$ of negative functions.
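This counterexample is easy to check numerically. A minimal sketch (the helper name `log_ratio` is mine, not from the answer): for $f(n)=2^{-n}$ and $g(n)=1/n$ we have $|\log_2 f(n)| = n$ and $|\log_2 g(n)| = \log_2 n$, so the ratio is $n/\log_2 n$, which is unbounded.

```python
import math

# f(n) = 2^{-n}, g(n) = 1/n: f(n) <= 1 * g(n) for all n >= 1, so f in O(g).
assert all(2**-n <= 1/n for n in range(1, 100))

def log_ratio(n):
    """Ratio |log2 f(n)| / |log2 g(n)| = n / log2(n)."""
    return abs(-n) / abs(-math.log2(n))

# The ratio keeps growing, so no constant c' can bound it.
ratios = [log_ratio(n) for n in (4, 16, 256, 65536)]
assert all(a < b for a, b in zip(ratios, ratios[1:]))
```

Since $n/\log_2 n \to \infty$, no choice of $c'$ works, matching the argument above.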


EDIT:

Here's an example without a negative function. Going back to the requirement $\log(g(n)) \to 0$: take $g(n) = 1+1/n$, so that $\log (g(n)) \to 0$.
(Actually, the above example of $2^{-n}$ vs. $1/n$ satisfies $g(n)\to 0$ rather than $\log(g(n))\to 0$. Oops.)

If we take $f(n) = 2 + 1/n$, then $\log f(n) \to 1$ (binary log, obviously).

So while $f(n) \le 3g(n)$ for every $n\ge 1$, it is not true that $\log f(n) \le c' \log g(n)$ for all large enough $n$: the right-hand side approaches zero, while the left-hand side approaches 1.
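The second counterexample can be checked the same way. A small sketch (the lambda names are mine): $f(n) \le 3g(n)$ holds everywhere, yet the ratio $\log_2 f(n) / \log_2 g(n)$ diverges because the denominator vanishes.

```python
import math

# f(n) = 2 + 1/n and g(n) = 1 + 1/n, both positive for all n >= 1.
f = lambda n: 2 + 1/n
g = lambda n: 1 + 1/n

# f(n) <= 3*g(n) for every n >= 1, so f in O(g):
assert all(f(n) <= 3 * g(n) for n in range(1, 1000))

# but log2(f(n)) -> 1 while log2(g(n)) -> 0, so the ratio diverges:
ratio = lambda n: math.log2(f(n)) / math.log2(g(n))
assert ratio(1000) > ratio(10) > ratio(2)
```

For instance, the ratio already exceeds several hundred at $n = 1000$, and it grows without bound, so no constant $c'$ can work.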


Question Source : http://cs.stackexchange.com/questions/28068
