
Runtime of algorithm to transpose a graph

Problem Detail: 

I am working with an algorithm to compute the transpose of a directed graph. Here is a rough overview of the algorithm: for each vertex in the graph, traverse each outgoing edge and add an edge from the neighbor back to the parent vertex. Adding an edge takes constant time, $O(1)$. I figured that if I am traversing each edge, the asymptotic running time should be $O(|E|)$. However, the running time of this algorithm is $O(|V| + |E|)$. The only way I can make sense of this is: even when a vertex has no outgoing edges, the outer loop still iterates over it. But if I take that into account, shouldn't the running time be $O(|V|)$? Why are we taking both $|V|$ and $|E|$ into account? Can someone offer some insight into this?
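The algorithm described above can be sketched as follows for an adjacency-list representation (the function and variable names here are illustrative, not from the question). The comments show where each term of the $O(|V| + |E|)$ bound comes from: the outer loop touches every vertex, including sinks with no outgoing edges, and the inner loop runs once per edge in total.

```python
def transpose(graph):
    # graph: dict mapping each vertex to a list of its out-neighbors
    # (an adjacency-list representation, as in the question).
    transposed = {v: [] for v in graph}  # |V| iterations, even for sinks
    for u in graph:                      # outer loop: |V| iterations total
        for v in graph[u]:               # inner loop: |E| iterations total
            transposed[v].append(u)      # O(1) per edge
    return transposed                    # overall Theta(|V| + |E|)
```

For example, transposing `{0: [1, 2], 1: [2], 2: []}` yields `{0: [], 1: [0], 2: [0, 1]}`; vertex 2 is visited by the outer loop even though it has no outgoing edges, which is exactly where the extra $|V|$ term comes from.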

Asked By : Christian
Answered By : Yuval Filmus

It all depends on how exactly you store the graph. For example:

  • If you store the graph as an adjacency matrix, the complexity is $\Theta(|V|^2)$.
  • If you store the graph as a linked list of edges per vertex (i.e., adjacency lists), then the complexity is $\Theta(|V|+|E|)$.
  • If you store the graph as a linked list of all edges (and don't store the vertices explicitly), then the complexity is $\Theta(|E|)$.

The representations usually used are the first two; I have never heard of the third being used. In most cases the number of edges exceeds the number of vertices, so the third method would have no advantage over the second, while having many disadvantages.
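To illustrate the first bullet point, here is a sketch of the same operation on an adjacency-matrix representation (names are illustrative). Every matrix entry must be visited whether or not it holds an edge, which is why the cost is $\Theta(|V|^2)$ independent of $|E|$:

```python
def transpose_matrix(adj):
    # adj: |V| x |V| adjacency matrix; adj[i][j] == 1 iff there is an edge i -> j.
    n = len(adj)
    # All n*n entries are read, so the cost is Theta(|V|^2)
    # regardless of how many edges the graph actually contains.
    return [[adj[j][i] for j in range(n)] for i in range(n)]
```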

