In Hromkovič's Algorithmics for Hard Problems (2nd edition) there is this theorem (2.3.3.3, page 117):
There is a (decidable) decision problem $P$ such that for every algorithm $A$ that solves $P$ there is another algorithm $A'$ that also solves $P$ and additionally fulfills
$\qquad \forall^\infty n \in \mathbb{N}.\ \mathrm{Time}_{A'}(n) \leq \log_2 \mathrm{Time}_A(n)$
$\mathrm{Time}_A(n)$ is the worst-case runtime of $A$ on inputs of size $n$ and $\forall^\infty$ means "for all but finitely many".
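As an illustration of how strong this is (our example, not from the book): if $P$ had an algorithm $A$ with, say, $\mathrm{Time}_A(n) = 2^n$, the theorem would yield an algorithm $A'$ with

$\qquad \forall^\infty n \in \mathbb{N}.\ \mathrm{Time}_{A'}(n) \leq \log_2 2^n = n,$

and applying the theorem to $A'$ would yield an $A''$ with $\mathrm{Time}_{A''}(n) \leq \log_2 n$ almost everywhere, and so on; in particular, $P$ would have no asymptotically optimal algorithm.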
No proof is given, and we have no idea how to go about this; in fact, the statement is quite counter-intuitive. How can the theorem be proven?
Asked by: Raphael
Answered by: Kaveh
It seems to be a simple case of Blum's Speedup Theorem:
Given a Blum complexity measure $(\varphi, \Phi)$ and a total computable function $f$ with two parameters, there exists a total computable predicate $g$ (a Boolean-valued computable function) such that for every program $i$ for $g$, there exists a program $j$ for $g$ so that for almost all $x$ $$f(x,\Phi_j(x)) \leq \Phi_i(x)$$
Just let the complexity measure be the time complexity measure (i.e. $\Phi_e(x)$ is the running time of the Turing machine with code $e$ on input $x$) and let $f(x,y) = 2^y$.
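To spell out the instantiation (a sketch, identifying algorithms for $P$ with programs for the predicate $g$): with $f(x,y) = 2^y$, the theorem provides a decidable predicate $g$, which serves as the problem $P$, such that for every program $i$ computing $g$ there is a program $j$ computing $g$ with $2^{\Phi_j(x)} \leq \Phi_i(x)$, i.e. $\Phi_j(x) \leq \log_2 \Phi_i(x)$, for almost all inputs $x$. The finitely many exceptional inputs occur at finitely many input lengths, and $\log_2$ is monotone, so taking the maximum over all inputs of length $n$ on both sides gives

$\qquad \forall^\infty n \in \mathbb{N}.\ \mathrm{Time}_{A'}(n) \leq \log_2 \mathrm{Time}_A(n),$

where $A$ is the algorithm given by program $i$ and $A'$ is the one given by program $j$.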
Question source: http://cs.stackexchange.com/questions/310