**Problem Detail:**

I am trying to wrap my head around these asymptotic notations. Given $f(n)$ and $g(n)$, one can write $f(n) = \Omega(g(n))$ as shorthand for: there exist constants $c > 0$ and $n_0$ such that $f(n) \geq c \cdot g(n)$ for all $n \geq n_0$. But what happens when $n < n_0$? What can we then say about $f(n)$ and $g(n)$? Do we just assume that $f(n)$ is a constant and the relation still works? Or does the question not even make sense to ask?

###### Asked By : Mads

###### Answered By : Raphael

Nothing. If all you know about two functions is their $O$/$\Omega$ relations, you know *nothing* about how they relate to each other on *any* finite prefix.
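A small numeric check makes this concrete (the functions here are made up for illustration): take $f(n) = n^2$ and $g(n) = 100n$. Then $f(n) = \Omega(g(n))$ with $c = 1$ and $n_0 = 100$, yet $f(n) < g(n)$ on the entire prefix $n < n_0$.

```python
# f(n) = n^2 is Omega(g(n)) for g(n) = 100n, witnessed by c = 1, n0 = 100.
def f(n):
    return n * n

def g(n):
    return 100 * n

# The Omega bound holds from n0 = 100 onward...
assert all(f(n) >= 1 * g(n) for n in range(100, 10_000))

# ...but it says nothing below n0: there f is strictly smaller than g.
assert all(f(n) < g(n) for n in range(1, 100))
```

Both assertions pass: the asymptotic statement is true, and so is the opposite inequality on every point of the finite prefix.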

Yes, this makes asymptotics of, say, runtime functions absolutely useless for questions like, "which algorithm should I use in scenario X?" At first sight, that is.

Luckily, we usually *do* analyse in more detail, but throw the details away when formulating the theorem with coarse notation like Landau symbols. Many results about "practical" algorithms implicitly say, "and this behaviour can be observed for reasonable $n$".

Sometimes you get better theorems which use Landau notation for "error terms" only; authors like Knuth do not say "algorithm A runs in time $O(n \log n)$", but rather "algorithm A takes $2n\log n - 2n + O(1)$ comparisons". This does not completely do away with your issue, but it mitigates it to some extent. If you know that the $O(1)$ stands for a constant $\leq 1377$, you can make informed decisions based on that result.
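For example, with the bound above you can compute the exact crossover against a competitor. The competitor's cost of $n^2/4$ comparisons below is a made-up figure for illustration, and the worst case $O(1) \leq 1377$ is plugged in as a pessimistic constant:

```python
import math

def cost_A(n):
    # Knuth-style bound: 2 n log2(n) - 2 n + O(1),
    # with the O(1) term pessimistically replaced by its bound 1377.
    return 2 * n * math.log2(n) - 2 * n + 1377

def cost_B(n):
    # hypothetical competitor taking n^2 / 4 comparisons
    return n * n / 4

# Smallest n at which A is guaranteed to beat B, despite the constant.
crossover = next(n for n in range(2, 10_000) if cost_A(n) < cost_B(n))
print(crossover)
```

A coarse "$O(n \log n)$ versus $O(n^2)$" statement could never yield such a concrete threshold; the explicit constant is what makes the comparison possible for finite $n$.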

Some of our reference material discusses facets of this issue.

Question Source : http://cs.stackexchange.com/questions/43613

