
[Solved]: Sort algorithm input probabilities

Problem Detail: 

Suppose that there is an algorithm which sorts a sequence of $n$ elements

$$a_1, a_2, ..., a_n$$

Each $a_i$ is chosen independently and uniformly (each value with probability $1/k$) from a set of $k$ distinct integers.

Is it true that, as $k \to \infty$:

  1. The probability that any two elements of the incoming sequence are equal tends to $0$?
  2. The probability that the incoming sequence is already sorted tends to $\frac{1}{n!}$?

Why / why not?

Asked By : wh1t3cat1k

Answered By : Yuval Filmus

Here is why you would expect these properties to hold. Suppose that $a_1,\ldots,a_n$ are chosen independently from the uniform distribution on $[0,1]$. The event $a_i = a_j$ has probability zero, and so with probability $1$ all numbers are distinct. Moreover, given that, all sequence orderings are equally likely, and since there are $n!$ of them, the probability that the sequence is ordered is exactly $1/n!$.
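This continuous-case intuition is easy to check numerically. The following Monte Carlo sketch (not part of the original answer; function and parameter names are illustrative) estimates the probability that $n$ i.i.d. Uniform$[0,1]$ draws come out already sorted:

```python
import random
from math import factorial

def estimate_sorted_probability(n, trials, seed=0):
    """Estimate P(sequence is nondecreasing) for n i.i.d. Uniform[0,1] draws."""
    rng = random.Random(seed)
    sorted_count = 0
    for _ in range(trials):
        seq = [rng.random() for _ in range(n)]
        # Check whether the sequence is already in nondecreasing order.
        if all(seq[i] <= seq[i + 1] for i in range(n - 1)):
            sorted_count += 1
    return sorted_count / trials

n = 4
est = estimate_sorted_probability(n, trials=200_000)
print(est, 1 / factorial(n))  # the estimate should be close to 1/24
```

With $200{,}000$ trials the estimate typically lands within a few thousandths of $1/4! \approx 0.0417$, matching the symmetry argument above.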

Proving that these claims still hold in the limit when the $a_i$ are sampled from a finite set requires some calculation, which I leave to you.

Best Answer from Computer Science Stack Exchange

Question Source : http://cs.stackexchange.com/questions/14198
