**Problem Detail:**

*This is a homework. I'd appreciate if you didn't give away answer straightaway but instead pointed me to the right direction.*

In the vast majority of sources, the definition of $\mathcal{O}(g)$ is:

$f, g : \mathbb{N} \to \mathbb{R}^+$

$f \in \mathcal{O}(g) \Leftrightarrow \exists c, n_0 \in \mathbb{N} \; \forall n > n_0 : f(n) \leq c \cdot g(n)$

But in Papadimitriou's book there is a different definition that I haven't seen anywhere else:

The order of $f$, denoted $\mathcal{O}(f)$, is the set of all functions $g : \mathbb{N} \rightarrow \mathbb{N}$ with the property that there are positive natural numbers $c > 0$ and $d > 0$ such that, for all $n \in \mathbb{N}$, $g(n) \leq c \cdot f(n) + d$.

I assume the first definition treats the functions as continuous, whereas the second definition is concerned only with the points $n \in \mathbb{N}$. Therefore values that are undefined or asymptotically go to $\pm\infty$ are no longer an issue.

For example, take $f(n) = \frac{1}{n}$ and $g(n) = 1$. We know that $f \in \mathcal{O}(g)$ because $\lim_{x\to\infty}\frac{1/x}{1} = 0$. We find the value of $f(1)$ and raise $g(n)$ above this point, e.g. $f(n) \leq 6 \cdot g(n)$ or $f(n) \leq g(n) + 6$.
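As a quick numerical sanity check of this example (a minimal sketch; the constant $6$ is the one picked above, though $c = 1$ already suffices here):

```python
# f(n) = 1/n and g(n) = 1, both viewed on the positive naturals.
def f(n):
    return 1 / n

def g(n):
    return 1

# f(1) = 1 = g(1) and f decreases, so even c = 1, d = 0 bounds f
# at every n >= 1; a fortiori c = 6 or d = 6 does.
def bound_holds(c, d, upto=1000):
    return all(f(n) <= c * g(n) + d for n in range(1, upto + 1))
```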

Is my assumption correct? If it is how can I prove this equivalence?

###### Asked By : Rudimentary Joe

#### Answered By : Gilles

The two definitions are not equivalent. However, the case where they aren't equivalent is one that we aren't interested in for algorithm analysis.

The standard definition implies the Papadimitriou definition. The proof is fairly straightforward, by following the definitions carefully.

Suppose that $\exists c, n_0 \in \mathbb{N}, \forall n \gt n_0, f(n) \le c \cdot g(n)$. Let $c$ and $n_0$ be such constants. We want to prove that there exist $c',d' \in \mathbb{N}^*$ such that $\forall n \in \mathbb{N}, f(n) \le c' \cdot g(n) + d'$. Take $c' = c$ and $d' = \max\{f(x) \mid x \le n_0\}$. If $n \le n_0$ then $f(n) \le d'$; if $n \gt n_0$ then $f(n) \le c \cdot g(n)$. Either way, for any $n \in \mathbb{N}$, $f(n) \le c' \cdot g(n) + d'$.
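The construction can be checked numerically. This is a minimal sketch; the concrete pair $f(n) = 3n + 10$, $g(n) = n$ and the constants $c = 4$, $n_0 = 10$ are illustrative choices, not from the answer:

```python
# Pair satisfying the standard definition: for n > 10, 3n + 10 <= 4n.
def f(n):
    return 3 * n + 10

def g(n):
    return n

c, n0 = 4, 10

# d' = max f(x) over x <= n0, exactly as constructed in the proof.
d_prime = max(f(x) for x in range(n0 + 1))

# The derived bound f(n) <= c*g(n) + d' should hold for *all* n,
# not just n > n0.
def papadimitriou_bound_holds(upto=1000):
    return all(f(n) <= c * g(n) + d_prime for n in range(upto + 1))
```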

The converse is not true. Again, unfolding the definition works up to a point, but then one little extra assumption is needed to make the proof go through.

Suppose that we have $c', d'$ such that $\forall n \in \mathbb{N}, f(n) \le c' \cdot g(n) + d'$. We want to find $c$ and $n_0$ such that $\forall n \ge n_0, f(n) \le c \cdot g(n)$. If there is an $n_0$ such that $\forall n \ge n_0, g(n) \ge 1$, then we can factor $g(n)$: for any $n \ge n_0$, $f(n) \le (c' + d'/g(n)) g(n) \le (c' + d') g(n)$, so we can take $c = c' + d'$.
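The choice $c = c' + d'$ can again be checked on a concrete pair (a minimal sketch; $f(n) = 2n + 5$, $g(n) = n$ with $c' = 2$, $d' = 5$, $n_0 = 1$ are illustrative assumptions):

```python
# f(n) = 2n + 5 <= 2*g(n) + 5 with g(n) = n, so c' = 2, d' = 5.
# g(n) >= 1 holds from n0 = 1 onward, the extra assumption needed.
def f(n):
    return 2 * n + 5

def g(n):
    return n

c_prime, d_prime, n0 = 2, 5, 1
c = c_prime + d_prime  # the constant constructed in the proof

# The standard bound f(n) <= c*g(n) should now hold for all n >= n0.
def standard_bound_holds(upto=1000):
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))
```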

The case where this proof fails is when there is no $n_0$ such that $\forall n \ge n_0, g(n) \ge 1$, i.e. when there are infinitely many integers $x$ such that $g(x) = 0$. (More generally, for real-valued functions, the problem case is when there is a set of values $(x_k)_{k\in\mathbb{N}}$ such that $\lim_{k\to\infty} x_k = \infty$ and $\lim_{k\to\infty} g(x_k) = 0$. Equivalently, that means that $g$ has no positive infimum.) Under the Papadimitriou definition as quoted here, the order of such a function includes all constants, but under the standard definition, there has to be a cutoff point above which $f$ is zero when $g$ is zero. For example, take the function $g : \mathbb{N} \to \mathbb{N}$ defined by $g(x) = 1$ if $x$ is odd and $g(x) = 0$ if $x$ is even. The function $f(x) = 1$ does not satisfy $f \in O(g)$ under the standard definition, but it does under the Papadimitriou definition.
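The counterexample at the end of the paragraph can be verified directly (a small sketch; the constants $c' = 1$, $d' = 1$ are one valid choice):

```python
# g(x) = 1 for odd x, 0 for even x; f(x) = 1 everywhere.
def g(x):
    return x % 2

def f(x):
    return 1

# Papadimitriou bound: f(x) <= c'*g(x) + d' holds at every x
# with c' = 1, d' = 1 (at even x it reads 1 <= 0 + 1).
papadimitriou_ok = all(f(x) <= 1 * g(x) + 1 for x in range(1000))

# Standard bound: f(x) <= c*g(x) fails at every even x >= n0,
# for any c and n0, because there c*g(x) = 0 < 1 = f(x).
def standard_fails(c, n0):
    return any(f(x) > c * g(x) for x in range(n0, n0 + 100))
```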

This corner case isn't important in algorithm analysis because complexity functions are never "really" zero. With space complexity, there's always a bit of stack space just for the function call. With time complexity, it takes a few instructions to call a function even if the function does nothing.

###### Best Answer from StackOverflow

Question Source : http://cs.stackexchange.com/questions/34064

