
[Solved]: Sorted-greedy for Load Balancing Problem

Problem Detail: 

In the load balancing problem we have $m$ machines and $n$ jobs, where job $j$ takes processing time $t_j$. The total processing time (load) on machine $i$ is $T_i =\sum_{j\in A(i)}{t_j}$, where $A(i)$ is the set of jobs assigned to machine $i$. The goal is to find an assignment of jobs to machines that minimizes the makespan $\max_i T_i$.

It is known that the sorted-greedy algorithm is only an approximation algorithm for the load balancing problem, but I cannot find an instance on which it actually fails to find an optimal solution. Can someone provide an example where this algorithm gives a suboptimal answer?
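For concreteness, here is a minimal Python sketch of the sorted-greedy rule I have in mind (the function name is only illustrative): sort the jobs by decreasing processing time and always give the next job to the currently least-loaded machine.

```python
def sorted_greedy(times, m):
    """Sorted-greedy: assign each job, longest first,
    to the machine with the smallest current load."""
    loads = [0] * m                      # current load T_i of each machine
    assignment = [[] for _ in range(m)]  # jobs placed on each machine
    for t in sorted(times, reverse=True):
        i = min(range(m), key=lambda k: loads[k])  # least-loaded machine
        loads[i] += t
        assignment[i].append(t)
    return loads, assignment
```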

Asked By : Ildar

Answered By : Gamow

Take for instance $m=2$ machines and $n=5$ jobs with processing times $4,3,3,2,2$.

  • The optimal schedule puts $4+3$ on one machine and $3+2+2$ on the other machine, for a makespan of $7$.

  • Sorted-greedy assigns $4$ to the first machine, $3$ to the second machine, the next $3$ again to the second machine (its load $3$ is smaller than $4$), then $2$ to the first machine and the last $2$ to the first machine. Hence the loads are $4+2+2=8$ and $3+3=6$, for a makespan of $8$ instead of the optimal $7$; the snippet below reproduces this run.
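Running the `sorted_greedy` sketch from the question on this instance confirms the loads (I assume ties between equally loaded machines are broken toward the first machine, which matches the run above; breaking the tie the other way still gives makespan $8$):

```python
loads, assignment = sorted_greedy([4, 3, 3, 2, 2], m=2)
print(loads)       # [8, 6]  -> makespan 8
print(assignment)  # [[4, 2, 2], [3, 3]]
# The optimal split {4, 3} / {3, 2, 2} has makespan max(7, 7) = 7.
```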

Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/37621
