My lecturer for the Algorithms course I am taking said that most of the data structures we will encounter have a basic action which is O(1).
Example: a binary heap. Its basic action is:
1. Compare the 2 children.
2. Compare the "winner" with its parent.
3. Replace (swap) when needed.
4. Repeat steps 1-3 with the "winner", up to and including the root (one such step is sketched below).
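Written out, a single one of those compare-and-swap steps might look like this (a rough sketch of my understanding, assuming a 0-indexed array-backed max-heap; the function name is mine):

```python
def basic_action(heap, parent):
    """One step: compare the two children of heap[parent], then compare
    the "winner" with its parent and swap when needed.
    Returns the winner's index if a swap happened, otherwise None."""
    n = len(heap)
    left, right = 2 * parent + 1, 2 * parent + 2
    if left >= n:                       # no children, nothing to compare
        return None
    # Compare the 2 children: the larger one "wins" (max-heap).
    winner = left
    if right < n and heap[right] > heap[left]:
        winner = right
    # Compare the "winner" with its parent; replace (swap) when needed.
    if heap[winner] > heap[parent]:
        heap[parent], heap[winner] = heap[winner], heap[parent]
        return winner
    return None
```

Each call only does a couple of comparisons and at most one swap, but step 4 repeats it over and over.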
How is O(1) even possible?
Asked By : Trinarics
Answered By : trutheality
What was meant is that every data structure is designed to do something really well, e.g.:
- For an array, lookup by index is O(1)
- For a linked list, insertion at the head is O(1)
- For a heap, finding the smallest element is O(1)
... and so on. This doesn't mean that an operation which is O(1) for one data structure can't have much worse complexity on some other data structure.
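As a rough Python illustration (using the standard library's heapq for a binary min-heap and collections.deque as a stand-in for a linked list; the values are arbitrary):

```python
import heapq
from collections import deque

arr = [40, 10, 30, 20]
x = arr[2]                  # array: lookup by index is O(1)

linked = deque([2, 3, 4])   # deque standing in for a linked list
linked.appendleft(1)        # insertion at the head is O(1)

heap = [40, 10, 30, 20]
heapq.heapify(heap)         # building the heap costs O(n), but...
smallest = heap[0]          # ...finding the smallest element is O(1)

# The same operation on the "wrong" structure is slower:
# finding the smallest element of an unsorted array is O(n).
smallest_in_arr = min(arr)
```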
Best Answer from StackExchange
Question Source : http://cs.stackexchange.com/questions/21985