I am reading a paper and it uses the expression "polynomial delay", which I don't understand. It is used in conjunction with big O notation, which I'm familiar with.
Here is an example sentence showing how it is used:
The loop is executed in time $O(n^2(e+n))$, which makes a polynomial delay for the output of the data.
I've tried searching a bit but I can only seem to find other papers and no definition.
Asked By : Martin Lavoie
Answered By : mdxn
The term "polynomial delay", in this case, refers to the time complexity of generating (or enumerating) successive solutions. This is a bit different from our standard notion of running time, since we usually only consider a single execution of a machine or algorithm on an input. The delay complexity measure deals with the time spent between consecutive outputs when enumerating a set (e.g. a set of solutions): an enumeration algorithm runs with polynomial delay if the time before the first output, between any two consecutive outputs, and after the last output is bounded by a polynomial in the input size.
Suppose you are working with a large dataset or database. You might want to step through or enumerate solutions to some problem using that data as an input. There are at least two steps involved here:
- Initialization (perhaps generating your first solution)
- Computing each subsequent solution in the enumeration.
For example, you might use a polynomial-space or exponential-time algorithm to initialize and preprocess your data. To generate each further solution, you may only need to do a little extra work. The time complexity of generating that next solution is the delay complexity measure you encountered in the paper: in particular, the delay between consecutive outputs is polynomial with respect to the size of the input.
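To make this concrete, here is a minimal sketch (my own illustration, not from the paper or the talk): a Python generator that enumerates all $2^n$ subsets of an $n$-element set by backtracking. The total running time is necessarily exponential, since there are exponentially many outputs, but the delay between any two consecutive outputs is only $O(n)$, so the enumeration runs with polynomial delay. The name `enumerate_subsets` is just illustrative.

```python
from typing import Iterator, List, Sequence, TypeVar

T = TypeVar("T")

def enumerate_subsets(items: Sequence[T]) -> Iterator[List[T]]:
    """Yield every subset of `items` via depth-first backtracking.

    There are 2^n subsets in total, so the overall running time is
    exponential, yet the delay between two consecutive yields is
    O(n): between one leaf (output) of the binary decision tree and
    the next, the recursion traverses at most O(n) stack frames.
    """
    current: List[T] = []

    def backtrack(i: int) -> Iterator[List[T]]:
        if i == len(items):
            yield list(current)  # one solution emitted
            return
        # Branch 1: exclude items[i]
        yield from backtrack(i + 1)
        # Branch 2: include items[i]
        current.append(items[i])
        yield from backtrack(i + 1)
        current.pop()

    yield from backtrack(0)


if __name__ == "__main__":
    for subset in enumerate_subsets(["a", "b", "c"]):
        print(subset)
```

Running this on a three-element list prints all eight subsets, and each output follows the previous one after at most $O(n)$ steps of unwinding and re-descending the recursion, regardless of how many subsets remain to be produced.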
Jan Ramon mentions this and other time complexity measures in his talk on "General Graph Refinement with Polynomial Delay": http://videolectures.net/mlg07_ramon_ggr/
Best Answer from Stack Exchange
Question Source : http://cs.stackexchange.com/questions/14614