
Computational complexity of coefficients

Problem Detail: 

I have some recurrence relations that I use to compute the coefficients of a series, and I want to know the time complexity of computing them. Suppose I know the coefficients $a_0,\dots,a_n$, and I want to compute $c_0,\dots,c_n$ as follows:

$ c_{n}=\begin{cases} e^{b_{0}}, & n=0\\ \frac{1}{n}\sum_{k=1}^{n}kb_{k}c_{n-k}, & n\geq1 \end{cases}$


$b_{n}=\begin{cases} \frac{1}{a_{0}}, & n=0\\ -\frac{1}{a_{0}}\sum_{i=1}^{n}a_{i}b_{n-i}, & n\geq1 \end{cases}$

Here is what I have so far. If we know $b_0,\dots,b_{k-1}$, then $b_k$ can be computed in $O(k)$ time, so computing $b_0,\dots,b_{n}$ takes $\sum_{k=0}^{n}O(k)=O(n^2)$ time.
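As a sketch of this step, here is the $b$-recurrence written out directly (the function name `compute_b` is my own, and floating-point arithmetic is used purely for illustration):

```python
# Sketch: compute b_0, ..., b_n from a_0, ..., a_n via
#   b_0 = 1/a_0,   b_n = -(1/a_0) * sum_{i=1}^{n} a_i * b_{n-i}.
# The inner sum for b_k costs O(k) multiply-adds, so the whole
# loop is O(n^2) arithmetic operations.

def compute_b(a):
    n = len(a) - 1
    b = [0.0] * (n + 1)
    b[0] = 1.0 / a[0]
    for k in range(1, n + 1):
        s = sum(a[i] * b[k - i] for i in range(1, k + 1))
        b[k] = -s / a[0]
    return b
```

For example, with $a = (1, 1, 0, 0)$, i.e. the series $1 + x$, the recurrence yields $b = (1, -1, 1, -1)$, the coefficients of the reciprocal series $1/(1+x)$.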

Now suppose that we know $b_0,\dots,b_{k}$ and $c_0,\dots,c_{k-1}$; then it takes $O(k)$ time to compute $c_k$. Thus computing $c_0,\dots,c_{n}$ takes $\sum_{k=0}^{n}O(k)=O(n^2)$ time. Of course, we also have to account for the $O(n^2)$ time it takes to compute $b_0,\dots,b_{n}$.
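The $c$-recurrence can be sketched the same way (again a hypothetical helper name, with floats standing in for exact arithmetic):

```python
import math

# Sketch: compute c_0, ..., c_n from b_0, ..., b_n via
#   c_0 = e^{b_0},   c_m = (1/m) * sum_{k=1}^{m} k * b_k * c_{m-k}.
# Each c_m costs O(m) operations, for O(n^2) total.

def compute_c(b):
    n = len(b) - 1
    c = [0.0] * (n + 1)
    c[0] = math.exp(b[0])
    for m in range(1, n + 1):
        c[m] = sum(k * b[k] * c[m - k] for k in range(1, m + 1)) / m
    return c
```

As a sanity check: feeding in $b = (0, 1, 0, 0)$ gives $c = (1, 1, \tfrac12, \tfrac16)$, the first coefficients of $e^x$, consistent with $c$ being the exponential of the series with coefficients $b$.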

So can I conclude that the time it takes to compute $c_0,\dots,c_{n}$ is $O(n^2)$?
Asked By : mark

Answered By : Yuval Filmus

You also need to take into account the size of the coefficients. Adding integer coefficients of bit-length $\ell$ takes time $O(\ell)$, and multiplying them takes time $\tilde{O}(\ell)$, i.e. $O(\ell (\log \ell)^k)$ for some $k$ (any $k > 1$ will do). Your case is slightly more complicated, since the answer is rational and uses the "indefinite" value $e^{1/a_0}$, so in general it is going to be a polynomial in $e^{1/a_0}$ with rational coefficients. This further increases the running time.
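This effect is easy to observe in practice. Below is a small illustration (my own example inputs, not from the question): running the $b$-recurrence over exact rationals with `fractions.Fraction` shows the coefficients' denominators growing in bit-length, so the $O(k)$ arithmetic steps are not unit-cost.

```python
from fractions import Fraction

# Same b-recurrence as in the question, but over exact rationals,
# to illustrate that the coefficients themselves grow in size.

def compute_b_exact(a):
    n = len(a) - 1
    b = [Fraction(0)] * (n + 1)
    b[0] = Fraction(1) / a[0]
    for k in range(1, n + 1):
        b[k] = -sum(a[i] * b[k - i] for i in range(1, k + 1)) / a[0]
    return b

# Arbitrary example input: a_0 = 3, a_i = 1/i for i >= 1.
a = [Fraction(3)] + [Fraction(1, i) for i in range(1, 20)]
b = compute_b_exact(a)

# Bit-lengths of the denominators grow with the index, so each
# addition/multiplication gets more expensive as k increases.
sizes = [x.denominator.bit_length() for x in b]
```

Multiplying the $O(n^2)$ operation count by the (growing) cost per operation gives the true bit-complexity, which is strictly larger than $O(n^2)$ in general.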

Best Answer from StackOverflow

Question Source :



