
Recursive equation for complexity: T(n) = log(n) * T(log(n)) + n

Problem Detail: 

While analyzing the running time of an algorithm, I'm stuck on this recursive equation: $$ T(n) = \log(n) \cdot T(\log n) + n $$ Clearly this can't be handled by the Master Theorem, so I was wondering if anybody has any ideas for solving this recurrence.

I'm pretty sure it should be solved with a change of variable, such as setting $n = 2^m$, but I couldn't find a substitution that works.

Asked By : Ashkan Kzme

Answered By : A.Schulz

I suppose you are looking for an asymptotic bound. Notice that the recursion depth is $\log^* n$, that is, the number of times you have to apply the logarithm repeatedly before the value drops below 2. Notice also that the function is increasing. Using these two facts, unfold the recursion once and you see that the sum has at most $\log^* n$ summands, each of them at most $\log^2 n$, plus the additional $n$. This sum is dominated by $n$.

$$ \begin{align} T(n)&=\log(n) T(\log n)+n=\log n \log\log n \;T(\log\log n)+\log^2(n)+n \\ &\le \log^*n\log^2n+n=O(n) \end{align} $$
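To get a feel for how slowly the recursion depth $\log^* n$ grows, here is a minimal Python sketch of the iterated logarithm (using base-2 logarithms; the base only shifts the count by a constant):

```python
import math

def log_star(n):
    """Iterated logarithm: how many times log2 must be applied
    before the value drops below 2."""
    count = 0
    while n >= 2:
        n = math.log2(n)
        count += 1
    return count

print(log_star(16))        # -> 3  (16 -> 4 -> 2 -> 1)
print(log_star(65536))     # -> 4
print(log_star(2**65536))  # -> 5, even for an astronomically large input
```

Even for $n = 2^{65536}$ the depth is only 5, which is why the $\log^* n \cdot \log^2 n$ term is negligible next to $n$.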

Clearly $T(n)\ge n$ and thus $T(n)=\Theta(n)$.
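As a numerical sanity check of the $\Theta(n)$ claim, one can evaluate the recurrence directly. This is a sketch that assumes a hypothetical base case $T(n) = 1$ for $n < 2$ and uses base-2 logarithms (neither choice affects the asymptotics):

```python
import math

def T(n):
    """Evaluate T(n) = log(n) * T(log n) + n numerically,
    assuming (hypothetically) T(n) = 1 for n < 2."""
    if n < 2:
        return 1.0
    return math.log2(n) * T(math.log2(n)) + n

# The ratio T(n)/n should stay bounded (and in fact tend to 1).
for n in [10, 100, 10**4, 10**8, 10**16]:
    print(n, T(n) / n)
```

The ratio $T(n)/n$ quickly approaches 1, consistent with the non-$n$ terms being lower order.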

Best Answer from Computer Science Stack Exchange

Question Source : http://cs.stackexchange.com/questions/14775
