
[Solved]: Suggest a data-structure that supports the following operations with time complexity of $O(\log n)$

Problem Detail: 

I've been struggling a lot with this one. I am looking for a data structure (it could be a modification of an existing data structure, or a combination of more than one), which supports the following methods, all with worst-case time complexity $O(\log n)$:

  1. insert(v) - insert v into the structure
  2. search(v) - search for v
  3. delete(v) - delete the node with value v
  4. updateKeys(x,v) - for each node with a value greater than x, add v to its value, where v is a positive number.

For example, if I had the following set of elements: $\{1,2,5,8,10,15,20\}$, calling $updateKeys(10,5)$ would change the set to $\{1,2,5,8,15,20,25\}$.

Well, I had a lot in my mind. Basically, my assignment is on the following data-structures:

  1. Hash tables - which sounded good at first, but I couldn't think of a way to find all elements larger than or equal to x.

  2. AVL tree - it does support the basic methods with this complexity, but in order to find which elements are larger, I would need to search through the whole tree, which can cost more than $\log n$. Furthermore, I could mess the tree up if, for instance, I made a left child larger than its parent.

  3. A combination of the two above - looked promising at first, but again, I couldn't find a way to find those elements.
  4. Heap - for some reason it sounds like a good direction, since I have some sort of order, but updating the values in $O(\log n)$ time seems impossible.

  5. Skip list - here I thought about using a skip list, since the bottommost layer is sorted in increasing order. I thought about finding the minimum value larger than x and then updating all of its successors to the right. Sounds good, but it is definitely not within $O(\log n)$: for the set $\{1,2,3,4,\dots,10\}$ and $x=1$, I would need to iterate over 10 numbers, which is $O(n)$.

We also studied B-trees, ADT structures...

Then, almost hopeless, I started reading about data structures that weren't even taught in our course, such as range trees, BIT (binary indexed) trees and so on. Some of them seemed suitable for this purpose, but since they are not in our syllabus, I don't think we are even allowed to use them.

Thanks in advance.

Asked By : superuser123

Answered By : Raphael

> AVL tree - it does support the basic methods with this complexity, but in order to find which elements are larger, I would need to search through the whole tree, which can cost more than $\log n$.

Not quite. In order to change all those entries, yes; but by searching for $x$ you can identify all nodes that have greater values: all visited nodes with key $> x$, namely those where you go left, together with their right subtrees.

> Furthermore, I could mess the tree up if, for instance, I made a left child larger than its parent.

No, and that may be the key insight: if we increase all keys $> x$ by the same constant, we maintain the search-tree property! In particular, you never increase a left child without also increasing its parent.

These two observations lead us to the idea: use AVL trees (or any balanced search tree, I daresay) and execute key increments lazily. For that, add to every node an integer field inc that is initially $0$.

Now, a call updateKeys(x,v) proceeds as follows.

  1. Search for x using the usual BST search.
  2. Whenever you go left at some node y,

    1. increase y.key by v and
    2. increase y.right.inc by v.

This adds at most constant overhead per visited node, so it has the same $\Theta$-running-time as search.

We execute the updates whenever we traverse the tree anyway. That is, augment all four operations so that whenever they access some y.key, they first execute the following (if y.inc > 0):

```
y.left.inc  += y.inc
y.right.inc += y.inc
y.key       += y.inc
y.inc        = 0
```

This adds at most a constant amount of effort per visited node for all the operations, so it does not change their $\Theta$-bounds on time. We do need a linear amount of extra memory, but that seems to be allowed.

Detailed pseudocode and proofs of correctness are good exercises.
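As a starting point, here is a minimal Python sketch of the lazy-increment idea (hypothetical names, not from the original answer). It uses a plain unbalanced BST rather than an AVL tree, since rotations are orthogonal to the lazy increments; a real solution would add rebalancing to guarantee $O(\log n)$ height.

```python
# Minimal sketch of lazy key increments on a BST (assumed/hypothetical names).
# An AVL tree would add rotations, which are orthogonal to this mechanism.

class Node:
    def __init__(self, key):
        self.key = key
        self.inc = 0          # pending increment for this node's subtree
        self.left = None
        self.right = None

def push_down(y):
    """Apply y's pending increment and pass it on to the children."""
    if y.inc:
        if y.left:
            y.left.inc += y.inc
        if y.right:
            y.right.inc += y.inc
        y.key += y.inc
        y.inc = 0

def insert(root, v):
    if root is None:
        return Node(v)
    push_down(root)
    if v < root.key:
        root.left = insert(root.left, v)
    else:
        root.right = insert(root.right, v)
    return root

def search(root, v):
    while root is not None:
        push_down(root)
        if v == root.key:
            return root
        root = root.left if v < root.key else root.right
    return None

def update_keys(root, x, v):
    """Add v to every key strictly greater than x, in O(height) time."""
    y = root
    while y is not None:
        push_down(y)
        if y.key > x:             # going left: y and its right subtree exceed x
            y.key += v
            if y.right:
                y.right.inc += v  # increment the right subtree lazily
            y = y.left
        else:
            y = y.right

def inorder(root):
    """In-order keys; pushes pending increments down as it visits nodes."""
    if root is None:
        return []
    push_down(root)
    return inorder(root.left) + [root.key] + inorder(root.right)

root = None
for k in [8, 2, 15, 1, 5, 10, 20]:
    root = insert(root, k)
update_keys(root, 8, 5)           # every key > 8 gains 5
print(inorder(root))              # [1, 2, 5, 8, 15, 20, 25]
```

Note that update_keys touches only the nodes on one root-to-leaf search path; everything to the right of that path is covered by a single inc update on a subtree root, which is what keeps the operation within the tree's height.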


Question Source : http://cs.stackexchange.com/questions/57910
