Are there any existing efficient algorithms for lazily computing a random permutation of the positive integers in a given range (e.g. the range offered by an unsigned integer type in a CPU)?
What I mean is this:
An algorithm that, given a source of randomness and a positive integer range (for simplicity it could be restricted to starting at 0 and ending at n), can be asked to produce the next value in that range. This can be repeated n times, and a different number is provided by the algorithm each time.
If the caller were to push all of the values retrieved from the algorithm into a queue, the resulting ordering would be just as random as if the result had been generated with the Fisher–Yates shuffle.
The naïve implementation would be to pick a random number in the range, check whether it has already been selected, and choose again if so. This starts out efficient, but the cost of each call grows as more of the range is used up -- too expensive for real-world applications. To really be useful it would have to be O(1) per call, though a good logarithmic time might be acceptable.
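For concreteness, a minimal sketch of that naive approach might look like this (illustrative Python, not from any particular library); it shows both the retry loop and the O(n) memory that rules it out for huge ranges:

```python
import secrets

def naive_lazy_shuffle(n):
    """Draw uniformly from range(n), retrying on repeats, so each value
    is yielded exactly once.  Late draws collide almost every time, so the
    expected total number of draws grows like n*log(n) (coupon collector),
    and the 'seen' set needs O(n) memory -- both too expensive here."""
    seen = set()
    while len(seen) < n:
        candidate = secrets.randbelow(n)
        if candidate in seen:
            continue          # already used; choose again
        seen.add(candidate)
        yield candidate
```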
It has to be lazy because with large ranges (such as a 64-bit unsigned integer) it would take exabytes of storage to precompute the whole permutation, and a real-world application is unlikely to need the entire range; it only needs the entire range to have been considered in producing the values it actually uses.
Essentially the problem is just lazily generating a mapping between the integers in a range and a random permutation of those same integers; being able to access the mapping at arbitrary points instead of just sequentially would be a bonus.
Asked By : Techrocket9
Answered By : D.W.
Yes. Use a block cipher with a small block size. This is extremely efficient and requires very little state.
A block cipher is a map $E : K \times X \to X$ such that, for every $k \in K$, $E(k,\cdot)$ is a permutation on $X$ (a bijective function $X\to X$). Moreover, if $k$ is chosen uniformly at random, then $E(k,\cdot)$ acts in a way that is computationally indistinguishable from a function chosen uniformly at random from all permutations on $X$.
Thus, a simple solution is to let $X$ be the set of possible values of an unsigned integer, e.g., $X=\{0,1,2,\dots,2^{64}-1\}$, let $x$ denote a counter ($x \in X$) and $k$ a random key. The state of your pseudorandom generator is $x$ and $k$. When you want a new random number, you output $E(k,x)$ and then increment $x$. This requires very little memory, is extremely efficient, and is provably good (if the block cipher is secure, which is a reasonable assumption given the state of modern cryptography).
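As a rough illustration of that state machine (the `encrypt_block` parameter below stands in for whatever block cipher you choose; it is not a real library call), the generator needs only the key and the counter:

```python
import secrets

class CipherPermutationGenerator:
    """Outputs E(k, x) for x = 0, 1, 2, ..., visiting every value of the
    cipher's domain exactly once.  `encrypt_block(key, x)` is assumed to be
    a keyed bijection on [0, 2**bits); see the Feistel sketch further below
    for one (toy) way to build such a function."""

    def __init__(self, encrypt_block, bits=64):
        self.encrypt_block = encrypt_block
        self.bits = bits
        self.key = secrets.token_bytes(16)   # k, chosen uniformly at random
        self.counter = 0                     # x, the only other state

    def __iter__(self):
        return self

    def __next__(self):
        if self.counter >= 1 << self.bits:
            raise StopIteration              # the whole range has been emitted
        value = self.encrypt_block(self.key, self.counter)  # output E(k, x)
        self.counter += 1                                    # then increment x
        return value
```

Note that because the only state is $(k, x)$, random access comes for free as well: the value at position $i$ is simply $E(k,i)$, which also covers the "arbitrary points" bonus in the question.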
You can find information on how to build a block cipher with a small block size here:
- Format Preserving Encryption (on Wikipedia)
- Can you create a strong blockcipher with small blocksize, given a strong blockcipher of conventional blocksize? (on crypto.SE)
- Is it possible to generate a secure permutation F over 32-bit integers even if F(0) … F(n) is public knowledge? (on crypto.SE)
- Can I create a fixed length output from a fixed length input? (on crypto.SE)
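To make this concrete, here is a toy sketch (my own illustration, not one of the constructions linked above) of a small-domain permutation built from a balanced Feistel network with a hash-based round function, plus cycle-walking to restrict it to an arbitrary range $[0, n)$; for real cryptographic guarantees you would use a vetted format-preserving encryption scheme such as FF1 instead:

```python
import hashlib

def feistel_permutation(key: bytes, x: int, bits: int, rounds: int = 8) -> int:
    """Toy keyed bijection on [0, 2**bits) (bits must be even): a balanced
    Feistel network whose round function is a truncated SHA-256 of
    (key, round number, right half).  Illustrative only, not vetted crypto."""
    half = bits // 2
    mask = (1 << half) - 1
    nbytes = (half + 7) // 8
    left, right = x >> half, x & mask
    for r in range(rounds):
        digest = hashlib.sha256(
            key + bytes([r]) + right.to_bytes(nbytes, "big")).digest()
        f = int.from_bytes(digest, "big") & mask
        left, right = right, left ^ f
    return (left << half) | right

def permute_in_range(key: bytes, x: int, n: int) -> int:
    """Map x in [0, n) into a pseudorandom permutation of [0, n) by
    cycle-walking: keep re-encrypting inside the enclosing power-of-two
    domain until the result lands back below n.  That domain is at most
    about 4*n, so only a few iterations are expected."""
    bits = max(2, (n - 1).bit_length())
    bits += bits % 2                     # even split for the Feistel halves
    y = x
    while True:
        y = feistel_permutation(key, y, bits)
        if y < n:
            return y
```

Combined with the counter scheme above (counting $x$ from $0$ to $n-1$ and outputting `permute_in_range(key, x, n)`), this enumerates each value of $[0, n)$ exactly once in pseudorandom order.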
Question Source : http://cs.stackexchange.com/questions/29822