I've heard the motto "interaction is more powerful than algorithms" attributed to Peter Wegner. The idea is that a (classical) Turing machine cannot handle interaction, that is, communication (input/output) with the outside world/environment.
How can this be so? How can something be more powerful than a Turing Machine? What is the essence of this story? Why is it not more well known?
Asked By : Dave Clarke
Answered By : Carl Mummert
Turing machines can handle interaction just fine, using oracle tapes. It works as follows: from the point of view of a computer that handles interaction, the operator's input is simply another sequence of bits that is sent into the computer from time to time. We all know that a lazy sysadmin could write a script to send the input to the program when it is requested, so that the sysadmin could go on break earlier. The interactive machine has no way to know whether there is really a live operator at the console or the input is being piped in from another program.
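To make the point concrete, here is a minimal Python sketch (the function name and the session.py filename are my own inventions) of a program whose entire window onto the world is stdin; nothing in its behavior depends on whether a human or a pipe supplies the bytes:

```
import sys

def interactive_session():
    # The program's only view of the "operator" is this byte stream.
    # It behaves identically whether a human types the lines or a
    # script pipes them in.
    for line in sys.stdin:
        command = line.strip()
        if command == "quit":
            print("goodbye")
            break
        print(f"you said: {command}")

if __name__ == "__main__":
    interactive_session()
```

Running `printf 'hello\nquit\n' | python session.py` produces exactly the transcript a live operator would see after typing the same two lines.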
Having all the interaction input prepared ahead of time is the same, in theoretical terms, as having all the input on a separate tape that is used by an oracle Turing machine. Whenever the computer would normally request interaction from the operator, the machine instead reads from the input tape. If the thing read from the tape seems invalid in some way, the Turing machine does exactly what the interaction machine would do on receiving that input.
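As a toy illustration (the name run_machine and its summing behavior are invented for this sketch), the oracle tape can be modeled as a pre-recorded sequence that the machine consumes wherever it would otherwise prompt the operator, including the invalid-input case:

```
from collections import deque

def run_machine(oracle_tape):
    # A deterministic "machine" whose only interaction is reading the
    # next symbol; the oracle tape stands in for all operator input.
    tape = deque(oracle_tape)
    total = 0
    while tape:
        symbol = tape.popleft()
        if not symbol.isdigit():
            # Invalid input: react deterministically, exactly as the
            # interactive version would on receiving the same symbol.
            print(f"ignoring {symbol!r}")
            continue
        total += int(symbol)
    return total

# The same computation results whether the tape was typed by a live
# operator or prepared ahead of time:
print(run_machine(["1", "2", "x", "3"]))  # ignoring 'x', then prints 6
```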
I would guess that Wegner is aware of the ability to use oracle tapes to code input, so you have to take his comments with a grain of salt, or ask what he actually means. I believe that people who think about interaction are generally worried about two things:
"Real" computers do have interaction, but algorithms as defined by Turing don't. We can get around this by coding the input on oracle tapes, but this still does not match the way that real computers operate. It might be nice to study models of computation that are more closely aligned with real computers.
Oracle-based algorithms are not often considered in day-to-day computing because ordinary computers don't come with a magic "oracle" to supply data. But we might be able to use a person as the oracle. If the person understands the data being requested, they may even be able to help the algorithm along, thus improving its performance. In other words, a human may be able to provide a useful oracle tape rather than merely a random one, and in principle this might lead to faster or more powerful computing methods than non-oracle-based ones. A similar thing happens with randomized computing, after all, where the machine is given a sequence of random bits as an extra input.
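One way to picture this, continuing the Python sketches (find_peak and its probing scheme are invented for illustration, not a standard algorithm): write the algorithm against an abstract bit source, then plug in either random bits or bits chosen by someone who knows the answer:

```
import random

def find_peak(values, advice_bits, tries=32):
    # Probe positions chosen by an external bit source. With random bits
    # this is an ordinary randomized search; with bits from a knowledgeable
    # helper (a "useful oracle"), the same code succeeds on the first probe.
    n_bits = max(1, (len(values) - 1).bit_length())
    best = values[0]
    for _ in range(tries):
        index = 0
        for _ in range(n_bits):
            # Read the next symbol from the advice source.
            index = (index << 1) | next(advice_bits)
        best = max(best, values[index % len(values)])
    return best

data = [3, 1, 4, 1, 5, 9, 2, 6]
random_source = iter(lambda: random.getrandbits(1), None)  # endless random bits
helpful_source = iter([1, 0, 1] * 32)  # a helper encoding index 5 (binary 101)
print(find_peak(data, random_source))   # almost surely 9 after 32 probes
print(find_peak(data, helpful_source))  # 9 immediately; every probe hits index 5
```

The algorithm itself is unchanged between the two runs; only the quality of the extra input tape differs, which is exactly the sense in which a human oracle could outperform a random one.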
Question Source : http://cs.stackexchange.com/questions/438