
[Solved]: For every computable function $f$ does there exist a problem that can be solved at best in $\Theta(f(n))$ time?

Problem Detail: 

For every computable function $f$, does there exist a problem that can be solved at best in $\Theta(f(n))$ time, or is there a computable function $f$ such that every problem that can be solved in $O(f(n))$ time can also be solved in $o(f(n))$ time?

This question popped into my head yesterday. I've been thinking about it for a bit now, but can't figure it out. I don't really know how I'd google for this, so I'm asking here. Here's what I've come up with:

My first thought was that the answer is yes: For every computable function $f$ the problem "Output $f(n)$ dots" (or create a string with $f(n)$ dots or whatever) can obviously not be solved in $o(f(n))$ time. So we only need to show that it can be solved in $O(f(n))$ time. No problem, just take the following pseudo code:

    x = f(n)
    for i from 1 to x:
        output(".")

Clearly that algorithm solves the stated problem, and its runtime is obviously in $\Theta(f(n))$, so problem solved. That was easy, right? Except no, it isn't, because you have to consider the cost of the first line. The above algorithm's runtime is only in $\Theta(f(n))$ if the time needed to calculate $f(n)$ is in $O(f(n))$. Clearly that's not true for all functions¹.
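To make the issue concrete, here is a minimal runnable sketch of the pseudocode in Python (the function names `print_dots` and the example `f` are just illustrative choices, not part of the question). The loop alone takes $\Theta(f(n))$ steps; whether the whole procedure does depends entirely on how long line one takes.

```python
def print_dots(f, n):
    # Line one of the pseudocode: evaluate f(n).
    # The cost of THIS step is the crux of the problem -- it need
    # not be bounded by O(f(n)) for an arbitrary computable f.
    x = f(n)
    # The remaining loop emits x dots and takes Theta(f(n)) time on its own.
    return "." * x


# With a cheap-to-evaluate f, the total runtime really is Theta(f(n)):
dots = print_dots(lambda n: 2 * n, 3)  # "......"
```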

So this approach didn't get me anywhere. I'd be grateful for anyone pointing me in the right direction to figure this out properly.


¹ Consider, for example, the function $p(n) = \cases{1 & \text{if $n$ is prime} \\ 2 & \text{otherwise}}$. Clearly $O(p(n)) = O(1)$, but there is no algorithm that calculates $p$ in $O(1)$ time.

Asked By : sepp2k

Answered By : Alex ten Brink

By the Gap theorem (using the formulation from here, search for 'gap'), for any computable unbounded function $g : \mathbb{N} \rightarrow \mathbb{N}$, there exists some increasing (in fact, arbitrarily large) computable function $f : \mathbb{N} \rightarrow \mathbb{N}$ such that $DTIME(f(n)) = DTIME(g(f(n)))$.

This answers your question in that there exists such an $f$ (infinitely many, in fact): for every computable function $g$ such that $g = o(n)$, there exists some increasing function $f$ such that all problems solvable in $O(f(n))$ time are also solvable in $O(g(f(n))) = o(f(n))$ time. Note that $f$ is not necessarily time-constructible - for the time-constructible case, see the answer by @RanG.

In the Wikipedia formulation (which requires that $g(x) \geq x$), $g \circ f$ becomes your example, and $f$ needs to be $\omega(n)$ (so you go the other way around - 'problems solvable in $O(g(f(n)))$ time are also solvable in $O(f(n))$ time' is the interesting part).

The Wikipedia article does not note that $f$ is increasing and can in fact be arbitrarily large ($f(n) \geq g(n)$ for instance). The article that proves the gap theorem does mention and prove this (see here, for example).


Question Source : http://cs.stackexchange.com/questions/1138
