
[Solved]: How do we derive the runtime cost of Karatsuba's algorithm?

Problem Detail: 

I've read the Wikipedia article explaining the complexity analysis of the Karatsuba algorithm, but I'm not fully grasping it. I seem to have gotten about 75% of the way to the solution on my own, but I lack the last 25% needed to fully understand why the complexity is $O(N^{\log_2 3})$.

I worked out myself that the recursion tree has height $\log_2 N$ (easy to see), and I understand that the $3$, which can trade places with $N$ due to the properties of logarithms, comes from the fact that each level of recursion produces three sub-problems per node of the tree.

I think the main thing that's tripping me up is that I'm not clearly seeing why $\log_2 N$ can be expressed as the exponent of $3$. I see where $\log_2 N$ and $3$ come from, and I see how, in the next step, you can swap $N$ and $3$ to get $N^{\log_2 3}$ (a law of logarithms I learned in high school).
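For reference, the swap in question can be justified in one line by rewriting $3$ as a power of $2$:

$$3^{\log_2 N} = \left(2^{\log_2 3}\right)^{\log_2 N} = 2^{(\log_2 3)(\log_2 N)} = \left(2^{\log_2 N}\right)^{\log_2 3} = N^{\log_2 3}$$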

To put this another way, I've been unable to work through the problem and visualize, step by step, how this formula plays out for the algorithm. That is, without already knowing the Master theorem, how does one derive this formula (I assume it's possible) by simply walking through the Karatsuba algorithm and watching the patterns emerge?

UPDATE: In writing out my understanding of the Wikipedia explanation line by line to identify where I was stuck, I finally saw the pattern. Maybe putting my write-up here will help someone else.

Asked By : Josiah

Answered By : Josiah

"Karatsuba's basic step works for any base B and any m, but the recursive algorithm is most efficient when m is equal to n/2, rounded up."

I understood this to mean that for integers with $n$ digits, $m$ is $\lceil n/2 \rceil$ ($m$ being the exponent applied to the base $B$ at which the operands are split). This can be seen if you simply look at the definition of the algorithm.
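To make the splitting concrete, here is a minimal Python sketch of the basic step (my own illustration, not code from the article or from Wikipedia), using base $B = 10$ and $m = \lceil n/2 \rceil$:

```python
def karatsuba(x, y):
    # Base case: single-digit operands are multiplied directly.
    if x < 10 or y < 10:
        return x * y
    # n is the digit count of the larger operand; m = ceil(n / 2).
    n = max(len(str(x)), len(str(y)))
    m = (n + 1) // 2
    # Split each operand at base B = 10 raised to m: x = x1 * 10^m + x0.
    x1, x0 = divmod(x, 10 ** m)
    y1, y0 = divmod(y, 10 ** m)
    # Three recursive multiplications instead of the naive four.
    z2 = karatsuba(x1, y1)
    z0 = karatsuba(x0, y0)
    z1 = karatsuba(x1 + x0, y1 + y0) - z2 - z0
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0
```

The key point for the analysis is that each call spawns exactly three recursive calls on operands of roughly half the digit count.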

"In particular, if n is $2^k$, for some integer k, and the recursion stops only when n is 1, then the number of single-digit multiplications is $3^k$, which is $n^c$ where $c = \log_2 3$"

I took this to mean: given that $n$ is some power of $2$ ($2, 4, 8, \ldots$), so that $n = 2^k$, the number of single-digit multiplications performed is $3^k$, i.e. $3$ raised to that same exponent $k$.

I worked out how this conclusion was reached by writing out the tree and finding that each sub-problem involves $3$ multiplications, so if the tree has $m$ levels, the bottom level contains $3^{m - 1}$ sub-problems. The $-1$ can be ignored because, by the properties of exponents, $3^{m - 1} = 3^m / 3$, a constant factor that Big-O doesn't consider.

Therefore, since the height of the tree is $\log_2 n$, we can express the exponent of $3$ as a log relative to $n$, giving $3^{\log_2 n}$ leaf multiplications, which by a law of logarithms turns into $n^{\log_2 3}$. Since the computational cost of each sub-problem by itself is a constant $7$ operations ($4$ additions and $3$ multiplications), independent of the number of digits in the number representation, the per-node computation can be folded into a constant factor, giving us $O(n^{\log_2 3})$.
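The leaf count above can be checked with a small recurrence model (an illustrative sketch of my own; it just mirrors the "three sub-problems per node" structure rather than multiplying anything):

```python
import math

def single_digit_mults(n):
    """Single-digit multiplications Karatsuba performs on n-digit
    operands, assuming n is a power of two so every split is an
    exact half and the recursion stops at one digit."""
    if n == 1:
        return 1
    return 3 * single_digit_mults(n // 2)

for k in range(8):
    n = 2 ** k
    count = single_digit_mults(n)
    # 3^k leaves, which equals n^(log2 3) by the log identity.
    assert count == 3 ** k
    assert math.isclose(count, n ** math.log2(3))
```

Each halving of the digit count multiplies the leaf count by $3$, which is exactly where $3^{\log_2 n} = n^{\log_2 3}$ comes from.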


Question Source : http://cs.stackexchange.com/questions/40216
