
Why is the %RSD of execution times, when sorting hundreds of arrays, lower for larger arrays of random integers?

Problem Detail: 

I am experimenting with sorting arrays and measuring their execution times. Using bubble sort, insertion sort, and quicksort, I have observed that the relative standard deviation (%RSD) of execution times, measured over five hundred arrays of the same size filled with random integers, decreases as the array size increases.

For example: if the %RSD of execution times for sorting 500 arrays of fifty elements each is about 7%, the same measurement for arrays of five hundred elements each is about 2.5%.

So the question is: why does this variation decrease as the array size increases?

P.S. I am using a cycle-accurate simulator (gem5) and counting exact clock cycles.
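The effect can be reproduced without gem5 by using the swap count of insertion sort as a deterministic stand-in for execution time. A minimal sketch (function names and the reduced trial count are illustrative, not from the original setup):

```python
import random
import statistics

def insertion_sort_swaps(values):
    """Sort a copy with insertion sort; return the number of element shifts."""
    a = list(values)
    swaps = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # one shift = one swap of adjacent elements
            swaps += 1
            j -= 1
        a[j + 1] = key
    return swaps

def rsd_of_swap_counts(n, trials=100, seed=0):
    """%RSD of total swap counts over `trials` random arrays of size n.
    (The question used 500 arrays; fewer trials keep this demo fast.)"""
    rng = random.Random(seed)
    counts = [insertion_sort_swaps([rng.randrange(1_000_000) for _ in range(n)])
              for _ in range(trials)]
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)

print(rsd_of_swap_counts(50))   # relative spread for small arrays
print(rsd_of_swap_counts(500))  # noticeably smaller for large arrays
```

With these (assumed) parameters the two printed values come out in the same ballpark as the 7% and 2.5% reported in the question, though the exact numbers depend on the seed and trial count.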

Asked By : tod
Answered By : tod

In the case of insertion sort, the running time depends on the number of swaps performed during sorting. Inserting the kth element (k = 1, 2, ..., n) of an array of size n may require anywhere from 0 to k-1 swaps.

The execution time mentioned in the question therefore depends on the total number of swaps performed while sorting a complete array.

Since the arrays are initialized randomly, the number of swaps performed for the kth element is a random variable bounded by that range.

Now, if we consider the average number of swaps per element (the total number of swaps divided by the number of elements in the array), then, by the law of large numbers, this average concentrates around its expected value as the number of elements per array increases.

That is, the relative variation in the total number of swaps per array, across different arrays, decreases (the absolute spread still grows, but more slowly than the mean).

Hence, the %RSD of the execution times also decreases.
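This law-of-large-numbers argument can be checked directly: for insertion sort the total swap count equals the number of inversions in the array, whose expectation for a uniformly random array is n(n-1)/4, i.e. (n-1)/4 swaps per element. A small sketch with assumed sample sizes (not the original experiment):

```python
import random
import statistics

def swaps_per_element(n, rng):
    """Inversions / n for one random array; for insertion sort, the total
    number of swaps equals the number of inversions."""
    a = [rng.random() for _ in range(n)]
    inv = sum(1 for i in range(n) for j in range(i + 1, n) if a[i] > a[j])
    return inv / n

rng = random.Random(42)
for n in (50, 500):
    samples = [swaps_per_element(n, rng) for _ in range(100)]
    expected = (n - 1) / 4  # E[inversions]/n for a uniformly random array
    rel_spread = 100.0 * statistics.stdev(samples) / statistics.mean(samples)
    print(f"n={n}: mean={statistics.mean(samples):.1f} "
          f"(expected {expected}), %RSD={rel_spread:.1f}")
```

The per-element average lands close to (n-1)/4 in both cases, while its relative spread shrinks for the larger arrays, matching the answer's argument.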


Best Answer from StackOverflow




