
[Solved]: Reasoning on Efficiency (2)

Problem Detail: 

Two algorithms that solve a particular problem can have their efficiency compared using the $O$ and $o$ notation. However, this is a very crude method, and it tells us nothing about how much more efficient one algorithm is than the other.

Is there a yardstick that can be applied to ALL algorithms, with ALL time complexities, that can be used to evaluate the relative efficiency of two algorithms?

I actually developed a method for this, but it applies only to polynomial algorithms.

MY METHOD.

Suppose we want to compare two algorithms whose running times are $f_i(n)$ and $g_i(n)$, where $f_i(n) = o\left(g_i(n)\right)$.
A simple way to compare $f_i$ and $g_i$ is to find their ratio $r_i(n)$:
$$r_i(n) = \frac{g_i(n)}{f_i(n)}$$
Express $r_i(n)$ as a polynomial in $n$:
$$r_i(n) = n^{k}, \quad k \in \Bbb R$$
$$m = \lceil{k}\rceil$$

Let $R_i(n)$ be the $m^{\text{th}}$ derivative of $r_i(n)$.
It follows that $R_i(n)$ is a constant.
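To make this concrete, here is a minimal sketch of the method in Python with sympy. The running times $f(n) = n$ and $g(n) = n^3$ are hypothetical examples of my own choosing, and the sketch assumes the exponent $k$ is an integer (sympy's `degree` only handles polynomial exponents):

```python
import sympy as sp

n = sp.symbols('n', positive=True)

# Hypothetical running times (illustrative assumptions, not from the problem):
f = n       # algorithm A: linear time
g = n**3    # algorithm B: cubic time

r = sp.simplify(g / f)   # ratio r(n) = g(n)/f(n) = n**2
k = sp.degree(r, n)      # exponent k of the monomial (here k = 2)
m = sp.ceiling(k)        # m = ceil(k)

R = sp.diff(r, n, m)     # m-th derivative of r(n)
print(r, k, m, R)        # prints: n**2 2 2 2  -> R is the constant 2
```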

My method works nicely for polynomial algorithms, but is completely useless for other time complexities. Is there a more effective yardstick? One that applies to all complexities?

Asked By : Tobi Alafin

Answered By : D.W.

The concepts of asymptotic running time analysis and Landau notation already provide a way to compare the asymptotic running time of two algorithms, as long as (a) you are happy to ignore constant factors, (b) only care about asymptotics (limiting behavior as problem size gets large), and (c) you want to evaluate them according to their worst-case running time. It sounds like you are happy with (a), (b), and (c), so you should be happy with existing tools. They do tell you how much more efficient one algorithm is than the other. You just might not have appreciated yet how to fully use these mathematical tools and concepts, which is understandable.

In particular, if we can express the running time of algorithm A as $\Theta(f(n))$ and the running time of algorithm B as $\Theta(g(n))$, we already know how to compare them. We know that the ratio of their running times is $\Theta(f(n)/g(n))$. In particular (a code sketch follows the list below):

  • If $f(n)/g(n) = \Theta(1)$, we know that their asymptotic worst-case running times are equivalent.

  • If $f(n)/g(n) = o(1)$ (i.e., $f(n) = o(g(n))$), we know that A is asymptotically faster, and we can measure asymptotically how much faster: it is faster by a factor of $\Theta(g(n)/f(n))$.

  • If $f(n)/g(n) = \omega(1)$ (i.e., $f(n) = \omega(g(n))$, or equivalently, $g(n) = o(f(n))$), we know that B is asymptotically faster, and we can measure asymptotically how much faster: it is faster by a factor of $\Theta(f(n)/g(n))$.
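To illustrate the three cases, here is a minimal sketch in Python with sympy that classifies two running times by taking the limit of their ratio. The example functions $n \log n$ and $n^2$ are assumptions for illustration, and this limit test only applies when the limit actually exists:

```python
import sympy as sp

n = sp.symbols('n', positive=True)

# Hypothetical running times (illustrative assumptions):
f = n * sp.log(n)   # algorithm A
g = n**2            # algorithm B

L = sp.limit(f / g, n, sp.oo)   # limit of f(n)/g(n) as n -> infinity

if L == 0:
    # f = o(g): A is asymptotically faster, by a factor of Theta(g/f)
    print("A is asymptotically faster")
elif L.is_finite:
    # f = Theta(g): equivalent asymptotic running times
    print("A and B have equivalent asymptotic running times")
else:
    # f = omega(g): B is asymptotically faster, by a factor of Theta(f/g)
    print("B is asymptotically faster")
```

Here the ratio is $n \log n / n^2 = \log n / n \to 0$, so the sketch reports that A is asymptotically faster.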

You might want to read How to come up with the runtime of algorithms?, How does one know which notation of time complexity analysis to use?, Sorting functions by asymptotic growth, How do O and Ω relate to worst and best case?, and Justification for neglecting constants in Big O. That will give you some essential background to help you understand how these concepts fit in.

(I'm simplifying a bit, and casting this at the level that it sounds like you're ready to absorb. Once you understand the basics, read Are the functions always asymptotically comparable? if you want to read some more precise statements -- but beware that right now this might just cause more confusion than it's worth.)


Question Source : http://cs.stackexchange.com/questions/67080
