Suppose $A$ is a sorted array with $n$ elements. I want to know whether we can determine, in $O(1)$ time, if $A$ has a majority element.
Recall that a majority element of $A$ is an element which appears more than $n/2$ times in $A$.
I can find an $O(\log n)$ algorithm for this problem, but I'm not sure whether it can be done in $O(1)$. I was trying to use the adversary method, but I don't know how to construct the adversary.
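For reference, here is a minimal Python sketch of one way such an $O(\log n)$ algorithm could look (not necessarily the one the asker had in mind; the name `has_majority_sorted` is illustrative). It uses the observation that in a sorted array any majority element must occupy the middle cell, so two binary searches over that value's run suffice.

```python
import bisect

def has_majority_sorted(A):
    """Return True if the sorted list A has an element occurring more than len(A)//2 times."""
    n = len(A)
    if n == 0:
        return False
    # In a sorted array, a majority element's run has length > n/2,
    # so it must cover the middle index n // 2.
    x = A[n // 2]
    # Two binary searches locate the boundaries of x's run: O(log n) total.
    first = bisect.bisect_left(A, x)
    last = bisect.bisect_right(A, x)   # one past the last occurrence of x
    return last - first > n // 2
```

For example, `has_majority_sorted([1, 2, 2, 2, 3])` returns `True`, while `has_majority_sorted([1, 2, 2, 3, 3])` returns `False`.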
Asked By : Paul
Answered By : R B
Any algorithm would need $\Omega(\log n)$ queries.
To see this, define $f(k)$ to be the number of queries needed to decide whether an element $x$ appears at least $a$ times in a sorted array $A$, assuming that $x$ occupies exactly the cells $A[m],A[m+1],\dots,A[M]$, where $k\triangleq\min\{a, m-1, n-M\}$.
Notice that in this notation we are looking to bound $f(\lfloor n/2 \rfloor)$; for the majority problem, $x$ is the median $A[\lfloor n/2 \rfloor]$, the only candidate for a majority element in a sorted array.
Claim: $f(k)\ge \log k$.
Proof:
Consider the first query made by the algorithm.
If it was made within radius $k/2$ of the interval (i.e. somewhere in $[m-k/2,\ldots,M+k/2]$) and the queried cell contained the median $x$, then we still need at least $f(k/2)$ further queries to decide the problem.
If it was made outside that radius, consider the case where the queried cell did not contain the median $x$. Once again, this leaves us with at least $f(k/2)$ queries to be made.
Continuing by induction gives the $\Omega(\log n)$ bound.
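To spell the induction out (a short unrolling of the claim above, taking $k$ to be a power of two for simplicity): each step costs one query and at best halves $k$, so
$$f(k) \;\ge\; 1 + f\!\left(\tfrac{k}{2}\right) \;\ge\; 2 + f\!\left(\tfrac{k}{4}\right) \;\ge\; \cdots \;\ge\; \log_2 k,$$
and with $k = \lfloor n/2 \rfloor$ this gives the $\Omega(\log n)$ lower bound.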
Best Answer from StackExchange
Question Source : http://cs.stackexchange.com/questions/41072