While thinking about different calculi for predicate logic (like natural deduction and sequent calculus), I noticed that these calculi are often presented in a form suitable for "human computers". A "human computer" is limited to using write once read many (WORM) memory when processing large amounts of data. Pure functional programming also seems to favor a WORM memory model. In fact, the WORM memory model seems so natural to me that classifying it as unconventional computing might underestimate its importance. (Understanding the strengths and limitations of the computing resources available to humans is important.)
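As a small illustration of the "write once, read many" flavour of pure functional code, here is a minimal Haskell sketch (the function and its name are just an illustrative choice): every intermediate value stays readable afterwards, but no value is ever overwritten in place.

```haskell
-- Illustrative sketch: "updating" a value builds a new one instead of
-- erasing the old, much like writing to fresh cells of WORM storage.
prefixes :: [a] -> [[a]]
prefixes = scanl (\acc x -> acc ++ [x]) []

main :: IO ()
main = mapM_ print (prefixes "abc")
-- prints "", "a", "ab", "abc"; every intermediate value remains readable,
-- none is ever rewritten in place.
```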
What is known about the relation between space and time complexity for machines with WORM memory? What are the keywords to google for available material related to these questions? Do we know whether the time complexity remains the same if a small amount of normal memory (for example $C \cdot \log(\text{WORM memory})^n$) is added?
Asked By: Thomas Klimpel
Answered By: Thomas Klimpel
What are the keywords to google for available material related to these questions?
The Wang B-machine is a very simple Turing machine with only WORM memory. More generally, a Turing machine that can write marks but never erase them is commonly referred to as a "non-erasing Turing machine".
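As a rough sketch of the model: the B-machine is usually presented with four instructions and no erase operation, namely move left, move right, mark the current square, and a conditional jump taken when the current square is marked. Below is a minimal Haskell encoding assuming exactly this instruction set; the data type and names are illustrative choices, not standard notation.

```haskell
import qualified Data.Set as Set

data Instr
  = MoveLeft
  | MoveRight
  | Mark              -- write a mark; there is no instruction to remove one
  | JumpIfMarked Int  -- jump to instruction n if the current square is marked

type Tape = Set.Set Int  -- set of marked positions; it only ever grows

-- Execute until the program counter leaves the program (the machine halts).
run :: [Instr] -> Tape
run prog = go 0 0 Set.empty
  where
    go pc pos tape
      | pc < 0 || pc >= length prog = tape
      | otherwise = case prog !! pc of
          MoveLeft       -> go (pc + 1) (pos - 1) tape
          MoveRight      -> go (pc + 1) (pos + 1) tape
          Mark           -> go (pc + 1) pos (Set.insert pos tape)
          JumpIfMarked n
            | pos `Set.member` tape -> go n pos tape
            | otherwise             -> go (pc + 1) pos tape

main :: IO ()
main = print (run [Mark, MoveRight, Mark])  -- marks squares 0 and 1
```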
What is known about the relation between space and time complexity for machines with WORM memory?
Let's assume a Turing machine with $q$ internal states and a binary tape alphabet. If the machine halts after $t$ steps, having written $m$ marks on the tape and visited $p$ tape positions, then $t \leq (m+1) \cdot pq$. The reasoning behind this bound is simple. Between two write events the tape content is fixed, so a configuration is determined by the head position and the internal state alone, which gives at most $pq$ distinct configurations. Since the machine halts, it cannot repeat a configuration without writing (it would cycle forever), so at most $pq$ steps pass between consecutive write events. The $m$ write events split the computation into at most $m+1$ write-free phases, which gives the bound.
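To make the counting argument concrete, here is a small Haskell sketch of a non-erasing machine with a binary tape alphabet that records $t$, $m$ and $p$ and compares them against the bound. The example machine, the encoding and all names are illustrative assumptions.

```haskell
import qualified Data.Map as Map
import qualified Data.Set as Set

type State = Int
data Move = L | R deriving (Eq, Show)

-- Transition table: (state, is current square marked?) ->
-- (write a mark now?, head move, next state); a missing entry means halt.
type Delta = Map.Map (State, Bool) (Bool, Move, State)

-- Run the machine from a blank tape and return (steps t, marks written m,
-- positions visited p).  The fuel argument only guards against
-- non-halting machines in this toy setting.
run :: Int -> Delta -> (Int, Int, Int)
run fuel0 delta = go fuel0 0 0 Set.empty (Set.singleton 0) 0 0
  where
    go 0    _  _   _    visited t marks = (t, marks, Set.size visited)
    go fuel st pos tape visited t marks =
      case Map.lookup (st, pos `Set.member` tape) delta of
        Nothing -> (t, marks, Set.size visited)             -- machine halts
        Just (doMark, mv, st') ->
          let tape'  = if doMark then Set.insert pos tape else tape
              marks' = if doMark && not (pos `Set.member` tape)
                         then marks + 1 else marks
              pos'   = if mv == L then pos - 1 else pos + 1
          in go (fuel - 1) st' pos' tape'
                (Set.insert pos' visited) (t + 1) marks'

-- A tiny example machine with q = 3 states: it marks two squares and then
-- walks left over them until it sees a blank square and halts.
example :: Delta
example = Map.fromList
  [ ((0, False), (True,  R, 1))   -- blank: mark it, move right
  , ((1, False), (True,  L, 2))   -- blank: mark it, turn around
  , ((2, True ), (False, L, 2))   -- marked: keep moving left (halts on blank)
  ]

main :: IO ()
main = do
  let q = 3
      (t, m, p) = run 1000 example
  putStrLn ("t = " ++ show t ++ ", m = " ++ show m ++ ", p = " ++ show p)
  putStrLn ("(m+1)*p*q = " ++ show ((m + 1) * p * q))
  print (t <= (m + 1) * p * q)    -- prints True, consistent with the bound
```

For this toy machine the run gives $t = 3$, $m = 2$, $p = 3$, well within the bound $(m+1) \cdot pq = 27$.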
Do we know whether the time complexity remains the same if a small amount of normal memory (for example $C \cdot \log(\text{WORM memory})^n$) is added?
Wang himself already showed that a non-erasing Turing machine can simulate a normal Turing machine with only a polynomial slowdown, and it was shown more recently that the same is true for the Wang B-machine. Hence it is not really interesting to investigate how the time complexity changes when a small amount of normal memory is added to the machine. It would be more interesting to investigate (for different algorithms) how much the normal space complexity can be reduced by additionally allowing a separate budget of non-erasable space. I'm not sure how to apply this to pure functional programming.
Question Source: http://cs.stackexchange.com/questions/18939