
[Solved]: Complexity of multiplication

Problem Detail: 

I've been reading around the area of complexity and arithmetic operations using logic gates; one thing that is confusing me is that $\Theta(n^2)$ is quoted as the complexity of multiplication by iterated addition. But addition of a number $n$ requires $\log_2(n)$ operations, one for each bit (or several times that, counting each NAND gate involved). So it strikes me as obvious that adding that number $n$ times will have a complexity of $n \log_2(n)$, which is definitely less than $\Theta(n^2)$. So where is this additional factor of $\frac{n}{\log_2(n)}$ coming from?

Asked By : Duke of Sam

Answered By : Yuval Filmus

Adding numbers of size $n$ (that is, $n$ bits long) takes time $O(n)$. Don't confuse a number with the size of its encoding, which is logarithmically smaller than the number itself.
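A minimal sketch of why addition is linear in the bit length: a ripple-carry adder does a constant amount of gate work per bit position, so two $n$-bit inputs cost $O(n)$ steps, not $O(\log n)$. The function name and operation counter below are illustrative, not part of the original answer.

```python
def ripple_carry_add(a_bits, b_bits):
    """Add two numbers given as LSB-first bit lists, ripple-carry style.

    Each bit position costs a constant number of gate operations
    (one full adder), so the total work is O(n) in the bit length n.
    Returns (sum_bits, full_adder_steps).
    """
    n = max(len(a_bits), len(b_bits))
    a_bits = a_bits + [0] * (n - len(a_bits))
    b_bits = b_bits + [0] * (n - len(b_bits))
    result, carry, steps = [], 0, 0
    for i in range(n):
        s = a_bits[i] ^ b_bits[i] ^ carry                    # sum bit of a full adder
        carry = (a_bits[i] & b_bits[i]) | (carry & (a_bits[i] ^ b_bits[i]))
        result.append(s)
        steps += 1                                           # one full-adder step per bit
    result.append(carry)                                     # final carry-out bit
    return result, steps
```

For example, adding 13 (`[1,0,1,1]` LSB-first) and 11 (`[1,1,0,1]`) takes exactly 4 full-adder steps, linear in the 4-bit width.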

When multiplying an $n$-bit integer $a$ by an $n$-bit integer $b$ using the iterative addition algorithm, you are adding up to $n$ shifted copies of $a$. Each addition costs you $O(n)$ rather than $O(\log n)$. The numbers $a,b$ themselves could be as large as $2^n$.
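The counting in the answer can be sketched as follows: up to $n$ shifted additions, each charged $O(n)$ bit steps, giving $\Theta(n^2)$ overall. The function name and the `2 * n` per-iteration charge are my own illustrative accounting (the intermediate sums can be up to $2n$ bits wide), not a precise gate count.

```python
def schoolbook_multiply(a, b, n):
    """Multiply two n-bit integers by iterated shifted addition.

    There are up to n additions, and each one operates on numbers up to
    2n bits wide, so each addition costs O(n) bit steps; we charge a
    nominal 2*n bit steps per iteration, for Theta(n^2) in total.
    Returns (product, charged_bit_steps).
    """
    product, bit_steps = 0, 0
    for i in range(n):
        if (b >> i) & 1:
            product += a << i      # one addition of a (up to) 2n-bit number
        bit_steps += 2 * n         # nominal O(n) charge for this iteration
    return product, bit_steps
```

Running it on two 4-bit numbers, e.g. `schoolbook_multiply(13, 11, 4)`, charges $4 \times 8 = 32$ bit steps, quadratic in the bit length, which is exactly where the extra factor of $n/\log_2(n)$ over the questioner's estimate comes from.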

Best Answer from CS StackExchange

Question Source : http://cs.stackexchange.com/questions/52640
