
[Solved]: Variants of the 3-SUM problem

Problem Detail: 

The 3SUM problem has two variants. In one variant, there is a single array $S$ of integers, and we have to find three different elements $a,b,c \in S$ such that $a+b+c=0$. In the other variant, there are three arrays $X,Y,Z$, and we have to find one number per array, $a\in X, b\in Y, c\in Z$, such that $a+b+c=0$. Call the first variant 3SUMx1 and the second one 3SUMx3.

Given an oracle for 3SUMx3, can we solve 3SUMx1 in linear time?

One option that comes to mind is to create three copies of the input array $S$ and run 3SUMx3($S,S,S$). However, this may return two copies of the same item. E.g., if $S$ has only two elements, 1 and -2, then the 3SUMx1 problem has no solution, but 3SUMx3($S,S,S$) will return the fake solution $(1,1,-2)$.
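To make the pitfall concrete, here is a small Python demonstration. The function `three_sum_x3` is a hypothetical brute-force stand-in for the oracle (the name and implementation are mine, used only for illustration):

```python
# Demonstration of the naive-reduction pitfall: calling the 3SUMx3
# oracle as 3SUMx3(S, S, S) can reuse the same element twice.
from itertools import product

def three_sum_x3(X, Y, Z):
    """Brute-force stand-in for the 3SUMx3 oracle: return (a, b, c)
    with a in X, b in Y, c in Z and a + b + c == 0, or None."""
    for a, b, c in product(X, Y, Z):
        if a + b + c == 0:
            return (a, b, c)
    return None

S = [1, -2]
# 3SUMx1 on S has no solution (no three *different* elements sum to 0),
# yet the naive reduction reports a fake triple that uses 1 twice.
print(three_sum_x3(S, S, S))  # (1, 1, -2)
```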

I currently have a solution but it is not linear. To explain the solution, consider first the simpler problems 2SUMx1 and 2SUMx2, in which we only look for two numbers $a+b=0$ in either one array or two different arrays. Given an oracle for 2SUMx2, the problem 2SUMx1 can be solved in the following way.

Let $n$ be the number of elements in the input array $S$.

    For i = 1 to log(n):
        Partition the array S into two arrays, X and Y, based on bit i of the index:
            X contains all elements S[k] where bit i of k is 0;
            Y contains all elements S[k] where bit i of k is 1.
        Run 2SUMx2(X, Y).
        If there is a solution, return it and exit.
    If no solution was found for any i, return "no solution".

This algorithm works because, if there are two different items $a,b\in S$ whose sum is 0, then their indices must differ in at least one bit, say bit $i$. So, in iteration $i$ they will land in different parts and will be found by 2SUMx2($X$,$Y$). The time complexity is $O(n \log n)$.
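The reduction above can be sketched in Python. Here `two_sum_x2` is a brute-force stand-in for the oracle (both names are mine); the reduction itself makes only $O(\log n)$ oracle calls:

```python
def two_sum_x2(X, Y):
    """Stand-in for the 2SUMx2 oracle: return (a, b) with a in X,
    b in Y and a + b == 0, or None."""
    ys = set(Y)
    for a in X:
        if -a in ys:
            return (a, -a)
    return None

def two_sum_x1(S):
    """2SUMx1 via the bit-partition reduction: one 2SUMx2 call
    per bit of the index."""
    n = len(S)
    for i in range(n.bit_length()):  # i ranges over the index bits
        X = [S[k] for k in range(n) if not (k >> i) & 1]  # bit i of k is 0
        Y = [S[k] for k in range(n) if (k >> i) & 1]      # bit i of k is 1
        sol = two_sum_x2(X, Y)
        if sol is not None:
            return sol
    return None  # no two different elements sum to 0
```

Any pair of distinct indices differs in some bit $i$, so the pair is separated into $X$ and $Y$ in that iteration and the oracle finds it.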

This algorithm can be generalized to 3SUM by using trits (base-3 digits) of the index instead of bits, and iterating over all pairs of trit positions. The time complexity is therefore $O(n \log^2 n)$.

Is there a linear-time algorithm?

Asked By : Erel Segal-Halevi

Answered By : D.W.

Randomized algorithms

If you'll accept a randomized algorithm, yes, it can be done in linear time. There's a randomized algorithm whose expected running time is $O(n)$, and where the probability that it takes longer than $c \cdot n$ time is exponentially small in $c$.

Here's the idea. Randomly permute the entries of $S$, then split it into three equal-sized pieces, each of length $n/3$: let $X$ be the first one-third, $Y$ the second one-third, and $Z$ the third one-third. Now call 3SUMx3(X, Y, Z). If this finds a solution, you have a solution to the 3SUMx1 problem. Moreover, I claim that if there is a solution to the 3SUMx1 problem, then with probability at least $6/27$, this procedure finds it. Therefore, the expected number of times you have to repeat it (before finding the first solution) is constant.

The running time of each iteration is $O(n)$, and you do a constant number of iterations (on average), so the total expected running time is $O(n)$. The probability that you fail to find a solution after $k$ iterations is $(21/27)^k$, which is exponentially small in $k$.

Where does $6/27$ come from? Suppose there's a solution to the 3SUMx1 problem, say $S[i]+S[j]+S[k]=0$. What is the probability that $i,j,k$ all land in different thirds of the permuted array? This probability is $3!/3^3 = 6/27$. So, if a solution to the 3SUMx1 problem exists, then each iteration of the above procedure has probability at least $6/27$ of finding it.
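The randomized reduction can be sketched as follows. `three_sum_x3` is again a brute-force stand-in for the linear-time oracle, and the iteration cap `max_iters` is my addition so the sketch terminates when no solution exists (making it Monte Carlo rather than Las Vegas):

```python
import random

def three_sum_x3(X, Y, Z):
    """Brute-force stand-in for the 3SUMx3 oracle."""
    for a in X:
        for b in Y:
            for c in Z:
                if a + b + c == 0:
                    return (a, b, c)
    return None

def three_sum_x1_randomized(S, max_iters=100):
    """3SUMx1 via repeated random split + one 3SUMx3 oracle call.
    If a solution exists, each iteration finds it w.p. >= 6/27, so the
    expected number of iterations is constant."""
    n = len(S)
    if n < 3:
        return None
    for _ in range(max_iters):
        perm = S[:]
        random.shuffle(perm)             # random permutation of S
        X = perm[: n // 3]               # first third
        Y = perm[n // 3 : 2 * (n // 3)]  # second third
        Z = perm[2 * (n // 3) :]         # last third
        sol = three_sum_x3(X, Y, Z)      # one O(n) oracle call
        if sol is not None:
            return sol
    return None  # no solution found; wrong with prob. <= (21/27)^max_iters
```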

Deterministic algorithms

What about a deterministic algorithm? I don't have a specific proposal in mind, but I feel like it's plausible that this construction could be derandomized.

For instance, to give you an idea of the sort of thing I'm talking about, suppose we split the array into thirds once (deterministically). Suppose there exists a solution $S[i]+S[j]+S[k]=0$. Then there are $3^3=27$ cases for which third each of $i,j,k$ falls into. After removing symmetries, there are only $1+6+3$ cases:

  • Type A (1 case): All three indices fall in different thirds.

  • Type B (6 cases): Two of the indices fall into the same third, and the other index falls into a different third.

  • Type C (3 cases): All three indices fall into the same third.

You can test for a type-A solution in $O(n)$ time, with a single invocation of the oracle: 3SUMx3(X, Y, Z). Also, you can test for a type-C solution by making three recursive calls to the same algorithm (one on X, one on Y, one on Z). This leaves only the question of how to test for a type-B solution. It seems like you might be able to test for a type-B solution using a similar recursive algorithm, though I haven't tried to work out the detailed case analysis. Some ideas along these lines could perhaps lead to a deterministic algorithm -- though I haven't worked out the details and I'm just speculating at this point.


Question Source : http://cs.stackexchange.com/questions/37888
