**Problem Detail:**

> **Possible Duplicate:**
> How can we assume comparison, addition, … between numbers is $O(1)$

When we calculate the time complexity of an algorithm, we make many simplifications (or assumptions) in our calculation. For instance, we assume that basic arithmetic operations all take constant time, and that reading from or writing to a memory location takes constant time.

I've worked on some algorithms where these assumptions don't hold. I'm looking for the background I need to properly approach/describe these algorithms. In particular I wish to know:

- What is the generic name for such a simplification/assumption?
- Is there a name for the basic model of simplifications we're using? (Assuming there is a common model)

*Note: I've done work with cache-dependent/cache-oblivious algorithms. I understand how to calculate in that domain, but I'm lacking the terminology/background to properly compare/contrast such algorithm analysis with analyses that don't consider the cache.*

###### Asked By : edA-qa mort-ora-y

#### Answered By : Arani

It is generally referred to as the "Random Access Machine" or RAM model. This model assumes that accessing any unit of data, irrespective of its position or type, takes the same constant amount of time. Now, what you take as a "unit" of data is important. For all practical purposes, we may consider the primitive data types (integer, float, and character, but not string) to be units of data. A single instruction on a unit of data is considered to take a constant amount of time.
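To see where the unit-cost assumption breaks down, here is a minimal sketch (in Python, assuming 64-bit machine words; `words_needed` is a hypothetical helper, not from the original post). In the word-RAM variant of the model, each O(1) operation acts on one machine word, so an arbitrary-precision integer that spans many words cannot really be added in constant time:

```python
def words_needed(x: int, word_bits: int = 64) -> int:
    """Number of machine words needed to store x in the word-RAM model."""
    return max(1, (x.bit_length() + word_bits - 1) // word_bits)

small = 42              # fits in one word: O(1) addition is realistic
big = 10 ** 1_000_000   # spans tens of thousands of words

print(words_needed(small))  # → 1
print(words_needed(big))    # tens of thousands; addition must touch them all
```

Under the word-RAM assumption, adding two such large integers costs time proportional to their word count, which is exactly the kind of case where the "arithmetic is O(1)" simplification no longer holds.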

###### Best Answer from StackExchange

Question Source : http://cs.stackexchange.com/questions/3327

