The answers to this question on Crypto Stack Exchange basically say that, to measure the complexity of the discrete logarithm problem, we have to take the length of the number representing the size of the group as the parameter. This seems arbitrary: why don't we choose the size of the group itself as the argument? Is there a criterion for deciding which argument to choose? I suspect I am overlooking something important, since the complexity changes hugely if we state it in terms of the size of the group.
Asked By : Nassim HADDAM
Answered By : Yuval Filmus
It doesn't matter whether you choose the size of the group $|G|$ or the length $n$ of the integer representing it as the parameter, since $n \approx \log_2 |G|$. There are two reasons why the complexity is usually described in terms of $n$ rather than $|G|$:
$n$ is the length of the input (more accurately, the input has length $\Theta(n)$), and we usually measure the complexity of algorithms as a function of the input length.
Usually $n$ is a small number such as $1024$, whereas $|G|$ is a huge number such as (roughly) $2^{1024}$.
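To make the gap concrete, here is a minimal sketch (not part of the original answer). It assumes a generic square-root discrete-log algorithm such as baby-step giant-step, taking roughly $\sqrt{|G|}$ group operations, and shows how the same cost reads as a modest $|G|^{1/2}$ when parameterized by the group size but as an exponential $2^{n/2}$ when parameterized by the input length $n = \log_2 |G|$:

```python
import math

n = 1024                # bit length describing the group (the input length)
group_size = 2 ** n     # |G|: the actual number of group elements

# Assumed cost: ~sqrt(|G|) group operations for a generic square-root algorithm.
ops = math.isqrt(group_size)

exponent_in_G = math.log(ops, group_size)   # ops = |G|^0.5  -> looks sublinear in |G|
exponent_in_n = math.log2(ops) / n          # ops = 2^(0.5*n) -> exponential in n

print(f"input length n   : {n} bits")
print(f"group size |G|   : 2^{n}")
print(f"group operations : about 2^{math.log2(ops):.0f}")
print(f"in terms of |G|  : |G|^{exponent_in_G:.2f}   (looks cheap)")
print(f"in terms of n    : 2^({exponent_in_n:.2f} n) (clearly exponential)")
```

The same $2^{512}$ operations are described either way; only the choice of parameter changes how the running time is labeled, which is why the convention of measuring in the input length $n$ matters.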
Question Source : http://cs.stackexchange.com/questions/64961