**Problem Detail:**

I have a deterministic function $f(x_1, x_2, ..., x_n)$ that takes $n$ arguments.

Given a set of arguments $X = (x_i)$, I can compute $U_X = \{ i \in [1, n] : x_i \text{ was read during the evaluation of } f(X) \}$

Would it be valid to use the set $K_X = \{(i, x_i): i \in U_X\}$ as a memoization key for $f(X)$?

In particular, I am worried that there may exist $X=(x_i)$ and $Y=(y_i)$ such that:

$$U_X \subset U_Y \tag{1}$$
$$\forall i \in U_X,\ x_i = y_i \tag{2}$$
$$f(X) \neq f(Y) \tag{3}$$

In my case, the consequence of the existence of such $X$ and $Y$ would be that $K_X$ would be used as a memoization key for $f(Y)$, and would thus return the wrong result.
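For concreteness, here is a minimal Python sketch of how $U_X$ and $K_X$ could be computed, assuming $f$ accesses its arguments only through an indexable object passed to it (the names `ReadTracker` and `evaluate_with_key` are hypothetical, not part of any real library):

```python
class ReadTracker:
    """Wraps the argument tuple and records every index that f reads."""
    def __init__(self, args):
        self._args = args
        self.read = set()                    # becomes U_X after the run

    def __getitem__(self, i):
        self.read.add(i)
        return self._args[i]

def evaluate_with_key(f, args):
    """Run f once; return (f(X), K_X) where K_X = {(i, x_i) : i in U_X}."""
    tracker = ReadTracker(args)
    result = f(tracker)
    key = frozenset((i, args[i]) for i in tracker.read)   # hashable K_X
    return result, key

# Example: f reads x[2] only when x[0] is truthy, so U_X depends on X.
def f(x):
    return x[2] if x[0] else 0

result, key = evaluate_with_key(f, (0, 99, 7))
# Here x[0] is falsy, so U_X = {0} and the key is {(0, 0)};
# x[1] and x[2] were never read and do not appear in the key.
```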

My intuition says that, since $f$ is a deterministic function of its arguments, there should not even exist $X$ and $Y$ (with $U_X$ a strict subset of $U_Y$) such that both $(1)$ and $(2)$ hold (much less all three!). I would like a proof of this, or, if it turns out to be trivial, at least pointers to the formalism that makes it trivial.

###### Asked By : Jean Hominal

###### Answered By : D.W.

As long as $f$ is deterministic and depends only on its arguments (not on other global variables), yes, you can safely use that as a memoization key. The bad case cannot happen.

You can see this by induction on the sequence of reads. Let $i_1, i_2, i_3, \dots$ be the sequence of argument indices that $f$ reads when running on input $X$ (i.e., $f$ first reads $x_{i_1}$, then $x_{i_2}$, and so on). The inductive claim is: for each $t$, the runs $f(X)$ and $f(Y)$ behave identically up until just before their $t$th read. The base case ($t=1$) holds because $f$ is deterministic and its behavior before the first read cannot depend on the arguments. For the inductive step, suppose the claim holds for $t$. Then, by determinism, both runs next examine the same index $i_t$; since $i_t \in U_X$, assumption $(2)$ guarantees $x_{i_t} = y_{i_t}$, so both runs see the same value and behave identically up until just before their $(t+1)$st read. By induction the two runs behave identically throughout, so in particular $U_X = U_Y$ and $f(X) = f(Y)$.
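The whole scheme can be sketched as a toy memoizer (a self-contained illustration, not a production design: the names are made up, lookup scans the cache linearly, and $f$ is assumed to access its arguments only through the object it is given):

```python
class _Tracked:
    """Records which argument indices f reads during one evaluation."""
    def __init__(self, args):
        self.args, self.read = args, set()

    def __getitem__(self, i):
        self.read.add(i)
        return self.args[i]

def memoize_by_reads(f):
    cache = {}                          # frozenset K_X -> f(X)
    def wrapper(args):
        items = set(enumerate(args))    # all (i, x_i) pairs of this call
        for key, value in cache.items():
            if key <= items:            # agrees with a cached run on its U_X
                return value            # by the induction, f gives the same result
        tracked = _Tracked(args)
        result = f(tracked)
        cache[frozenset((i, args[i]) for i in tracked.read)] = result
        return result
    return wrapper

@memoize_by_reads
def g(x):                               # reads x[2] only when x[0] is truthy
    return x[2] if x[0] else -1

assert g((0, 10, 20)) == -1             # computed; caches key {(0, 0)}
assert g((0, 99, 77)) == -1             # cache hit: agrees on U_X = {0}
assert g((1, 0, 77)) == 77              # differs on x[0]: fresh evaluation
```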

This assumes everything is deterministic. For instance, $f$ had better not be allowed to read a random-number generator, the current time of day, or other global variables. Beware that on some platforms there can be surprising sources of non-determinism, e.g., due to aspects of behavior that are unspecified and might depend on external factors. For instance, I have a recollection that floating-point math in Java can introduce non-determinism in some cases, and so if you want to write code that is verifiably deterministic, you probably want to avoid all floating-point math.

Question Source : http://cs.stackexchange.com/questions/53309

