
[Solved]: Size of the instance of a problem

Problem Detail: 

I am unable to find a clear definition of the term "size of the instance" when considering an algorithmic problem. I came across a definition saying that "the size of the instance corresponds to the size of the input array of numbers (in case the input is numeric) or the size of the representation of the input."

Is the size of the instance simply the size of the input? Or is it the size of all the structures used in the algorithm (in that case, it would be the space complexity, right?)?

Let's illustrate this with specific problems:

a) I have an array of n numbers and I have to sort them in non-decreasing order.

  • So the size of the instance is n (those n numbers I have to sort, using, for example, BubbleSort). Is the size different for some other sorting algorithm that uses additional arrays during the sorting process?

b) Find the strongly connected components of a graph with n vertices and m edges.

  • Is it simply n+m? Or n+2m (if I represent the edges by a 2*m array)? Or do I have to consider all the structures used in my algorithm (in this case it would be Tarjan's algorithm, I guess)?

Thank you for clarification!

Asked By : Smajl

Answered By : Yuval Filmus

Unfortunately, there is no single definition of the size of an instance. But if there were, it would be the size of the input in bits. Whenever we talk about Turing machines, we have to slightly change this to the size of the input in tape symbols.

Things get more complicated for models like the RAM machine, in which numbers are opaque. If the input consists of arbitrary-precision numbers, then the input size is the number of input numbers. This is again a variation on the earlier definition, since in this model every memory cell contains an entire number.

Summarizing, the size of the input is always the number of cells / tape positions containing the input, but the interpretation of this quantity depends on the model.

More concretely, in your first example, the input includes the input array of length $n$ as well as the number $n$, so the input length should be at least $n+1$. The model in question here is the RAM model. If it were the Turing machine model, the size of the input would depend on the size of the integers being sorted.
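
To make this concrete, here is a small Python sketch (my own illustration, not part of the original answer) that computes both measures for a sorting instance: the RAM-model size, which just counts cells, and a Turing-machine-style size, which counts the bits needed to write the numbers down.

```python
def ram_size(numbers):
    # RAM model: each (arbitrary-precision) number occupies one cell,
    # plus one cell holding the length n itself.
    return len(numbers) + 1

def bit_size(numbers):
    # Turing-machine-style measure: total number of bits needed to
    # write the numbers down, plus separator symbols between them.
    bits = sum(max(1, abs(x).bit_length()) for x in numbers)
    separators = max(0, len(numbers) - 1)
    return bits + separators

instance = [5, 1000000007, 3, 42]
print(ram_size(instance))  # 5 cells, regardless of how large the numbers are
print(bit_size(instance))  # grows with the magnitudes of the numbers
```

The point of the comparison is that the same array has a fixed size in the RAM model but a magnitude-dependent size in the bit model, which is exactly why the model matters when stating running times.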

In your second example, it depends on the exact encoding. In the RAM model, you can encode a graph with $n$ vertices and $m$ edges using $O(m)$ cells. In the Turing machine model, you can encode a graph using $O(n^2)$ cells or $O((m+1)\log n)$ cells, using the dense and sparse encodings, respectively. Sometimes the input encoding affects the running time, and in that case the algorithm would expect the input in some specific encoding, and you would get the input length from this encoding. Notice that I don't bother calculating the exact input length; almost always, asymptotics are enough.
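
As a rough sketch of the two encodings mentioned above (again my own illustration, with hypothetical helper names): the dense encoding writes out an $n \times n$ adjacency matrix, costing $O(n^2)$ bits, while the sparse encoding lists the $m$ edges as pairs of vertex indices, each index taking about $\log_2 n$ bits.

```python
def dense_encoding(n, edges):
    # Adjacency matrix: one bit per ordered pair of vertices, O(n^2) bits.
    matrix = [[0] * n for _ in range(n)]
    for (u, v) in edges:
        matrix[u][v] = 1
    return matrix

def sparse_encoding(n, edges):
    # Edge list: each edge is a pair of vertex indices, each index taking
    # about log2(n) bits, so roughly 2 * m * log2(n) bits overall.
    bits_per_index = max(1, (n - 1).bit_length())
    return list(edges), 2 * bits_per_index * len(edges)

edges = [(0, 1), (1, 2), (2, 0)]
print(len(dense_encoding(3, edges)) ** 2)  # 9 bits for n = 3
print(sparse_encoding(3, edges)[1])        # 12 bits for m = 3 edges
```

For a sparse graph (m much smaller than n^2) the edge-list encoding is far shorter, which is why the stated input size, and hence the meaning of "linear time", depends on which encoding the algorithm expects.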

Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/26268
