PARALLEL MACHINE SCHEDULING USING LAGRANGIAN RELAXATION

Peter B. Luh*, Debra J. Hoitomt**, Eric Max**, Krishna R. Pattipati*

* Department of Electrical & Systems Engineering, University of Connecticut, Storrs, CT 06268.
** Pratt & Whitney, East Hartford, CT 06108.

ABSTRACT

This paper presents a two-level optimization methodology for scheduling independent jobs with due dates on parallel machines (resources). The jobs may have different penalties for missing their respective due dates and may require the use of the resource for varying lengths of time. A Lagrangian relaxation technique is applied to a constrained integer programming formulation of the problem. A decomposition by job of the dual problem serves to simplify the solution at the low level. The high level problem is then solved via a subgradient method. In general, the resulting solution in the dual space is associated with an infeasible solution in the primal space. A heuristic approach is developed to generate a feasible solution, using the solution to the dual problem and the concept of "degree of freedom" in scheduling each job. On two problems tested, the feasible solutions generated by the heuristic are within 0.5% of the dual optima. The concept of "degree of freedom" can be used to study the scheduling of new jobs. Furthermore, since Lagrange multipliers contain sensitivity information, our methodology provides a natural vehicle to study the interaction of jobs as they compete for limited resources. The method can also be extended to general job shops.

1. INTRODUCTION

The problem of scheduling is one of the most important planning and operational issues in production systems. In fact, productivity problems in the United States are, in large part, attributable to problems in the scheduling function: not having the right items when they are needed, and not having equipment available when it is needed (Meredith and Gibbs (1984)).

In this paper, we consider the non-preemptive scheduling of independent jobs with due dates on identical parallel machines (resources). For each job, the resource is required for a specified period of time, and a penalty is applied when the due date is not met. This problem description is an abstraction of a real production environment. Unfortunately, the problem is NP-hard (Lawler, et al. (1982)), and no polynomial time algorithms are known to exist.

An integer programming formulation of the problem is presented in Section 2. In Section 3, the Lagrangian relaxation technique is used to decompose (by job) the scheduling problem, thus simplifying the solution procedure. Low level subproblems are solved by using a simple selection rule, whereas the high level problem is solved via a subgradient method. In general, the resulting solution in the dual space is associated with an infeasible solution in the primal space. A heuristic approach is developed to generate a feasible solution using the solution to the dual problem, and the concept of "degree of freedom" in scheduling each job. Numerical testing results are presented in Section 4. On the two problems tested, the solutions are within 0.5% of the dual optimum. The concept of "degree of freedom" can be used to study the scheduling of new jobs. Furthermore, since Lagrange multipliers contain sensitivity information, our methodology provides a natural vehicle to study the interaction of jobs as they compete for limited resources. The method can also be extended to general job shops.

2. PROBLEM FORMULATION

Many scheduling formulations exist in the literature. The following discrete-time, integer programming formulation is due to Bruvold and Evans (1985) in some variable definitions; Norbis and Smith (1986) in some constraint statements; and generally to Everett (1963), Fisher (1973), and Conterno and Ho (1987). We first define the following variables:

δ_ik    integer variable equal to one if job i is active at time k
ω_ik    integer variable equal to one if job i is active at time k but not at k+1 (job completion time = k)
C_i     completion time of job i



D_i     due date of job i
M_k     units of resource available at time k
N       number of jobs
t_i     time required from the resource by job i
T       time horizon under consideration
w_i     weight (or importance) of job i
z       objective function, to be optimized
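To fix ideas before the equations, the following minimal Python sketch (not taken from the paper) shows one way to hold these quantities; the Job container, the field names, and all numerical values are hypothetical.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Job:
        due: int          # D_i, due date of job i
        proc_time: int    # t_i, time required from the resource
        weight: float     # w_i, weight (importance) of job i

    # Hypothetical example data: N = 3 jobs, T = 8 time slots,
    # and M_k = 2 identical machines available in every slot.
    T = 8
    jobs: List[Job] = [
        Job(due=3, proc_time=2, weight=1.0),
        Job(due=5, proc_time=3, weight=2.0),
        Job(due=6, proc_time=2, weight=0.5),
    ]
    M = [2] * T          # M_k for k = 1, ..., T (index 0 corresponds to slot 1)
    N = len(jobs)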

We assume that job scheduling is non-preemptive, so that a contiguous block of time of length t_i is needed to process job i. Therefore, the following equation holds by definition:

    C_i = \sum_k k \, \omega_{ik}   (i = 1, 2, ..., N).                                        (2.1)

One particular objective function of interest is

    z = \sum_i w_i f(C_i, D_i),  with  f(C_i, D_i) = [\exp(C_i - D_i - 1)] \, u(C_i - D_i - 1),   (2.2)

where u(C_i - D_i - 1) = 1 if C_i - D_i - 1 ≥ 0, and zero otherwise. The above objective function accounts for the weight of jobs, the importance of meeting due dates, and the fact that a job becomes more critical with each time unit after passing its due date.
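As a brief numerical illustration (hypothetical numbers, not from the paper): for a job with weight w_i = 2 and due date D_i = 5 that completes at C_i = 7, we have C_i - D_i - 1 = 1 ≥ 0, so u(·) = 1 and the job contributes w_i exp(1) ≈ 5.44 to z; if the same job completed by its due date (C_i ≤ 5), then C_i - D_i - 1 < 0, u(·) = 0, and its contribution would be zero.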

A static and deterministic parallel machine scheduling problem can now be formulated as follows:

    P:   \min_{\delta, \omega} \; z,                                                           (2.3)

subject to

    Resource availability:   \sum_i \delta_{ik} \le M_k    (k = 1, 2, ..., T),  and            (2.4)

    Processing time:         \sum_k \delta_{ik} = t_i      (i = 1, 2, ..., N).                 (2.5)
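To make the formulation concrete, the following minimal Python sketch (an illustration under assumed data, not code from the paper) evaluates the objective (2.2) and checks constraints (2.4) and (2.5) for a candidate 0-1 assignment δ; the job data, the capacity profile, and the dictionary layout of δ are all hypothetical.

    import math
    from typing import Dict

    # Hypothetical data: due dates D_i, processing times t_i, weights w_i,
    # horizon T, and capacity M_k (slot index k runs 1..T).
    D = {1: 3, 2: 5, 3: 6}
    t = {1: 2, 2: 3, 3: 2}
    w = {1: 1.0, 2: 2.0, 3: 0.5}
    T = 8
    M = {k: 2 for k in range(1, T + 1)}

    def f(C_i: int, D_i: int) -> float:
        # Tardiness penalty of eq. (2.2): exp(C_i - D_i - 1) u(C_i - D_i - 1).
        return math.exp(C_i - D_i - 1) if C_i - D_i - 1 >= 0 else 0.0

    def objective(delta: Dict[int, Dict[int, int]]) -> float:
        # z = sum_i w_i f(C_i, D_i), with C_i taken as the last active slot of job i.
        return sum(w[i] * f(max(k for k, a in delta[i].items() if a), D[i]) for i in delta)

    def feasible(delta: Dict[int, Dict[int, int]]) -> bool:
        # Resource availability (2.4): at most M_k jobs active in slot k.
        for k in range(1, T + 1):
            if sum(delta[i].get(k, 0) for i in delta) > M[k]:
                return False
        # Processing time (2.5): job i is active for exactly t_i slots.
        return all(sum(delta[i].values()) == t[i] for i in delta)

    # A candidate schedule: job 1 occupies slots 1-2, job 2 slots 1-3, job 3 slots 4-5.
    delta = {1: {1: 1, 2: 1}, 2: {1: 1, 2: 1, 3: 1}, 3: {4: 1, 5: 1}}
    print(feasible(delta), round(objective(delta), 3))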

3. SOLUTION METHODOLOGY

3.1 The Lagrangian Relaxation Approach to the Scheduling Problem

Using Lagrangian relaxation as an approach to obtaining a set of decomposed subproblems was pioneered by Everett (1963). Fisher (1973) recognized its potential specifically for scheduling problems, while Geoffrion (1974) further developed the theoretical aspects of integer problems and the corresponding solution methodology. Recently, Sandell, et al. (1982), Conterno and Ho (1987) and Gavish (1987) exploited the decentralized structure of the problem in obtaining solutions for related problems.

For notational simplicity, note that the objective function can be rewritten as

    f(C_i, D_i) = [\exp(\sum_k k \, \omega_{ik} - D_i - 1)] \, u(\sum_k k \, \omega_{ik} - D_i - 1).     (3.1)

For notational convenience, we define x \equiv (\delta, \omega) \in X, where

    X \equiv \{ (\delta, \omega) : \delta_{ik}, \omega_{ik} \in \{0, 1\} \}.                    (3.2)

To decompose the problem according to jobs, we relax the resource availability constraint (eq. (2.4)) to form the relaxed problem (here, we have also used eq. (3.2)):

    R:   \min_{x \in X} \; \sum_i w_i f(C_i, D_i) + \sum_k \pi_k \Big( \sum_i \delta_{ik} - M_k \Big),   (3.3)

subject to

    \sum_k \delta_{ik} = t_i   (i = 1, 2, ..., N),                                              (3.4)

i.e., the processing time constraint (2.5). Then the dual problem is

    D:   \max_{\pi \ge 0} \; \Big( -\sum_k \pi_k M_k + \min_{x \in X} \sum_i \big( w_i f(C_i, D_i) + \sum_k \pi_k \delta_{ik} \big) \Big),   (3.5)

subject to (3.4); which leads to the following decomposed subproblem for each job i (given π):

    R_i:   \min \; \big( w_i f(C_i, D_i) + \sum_k \pi_k \delta_{ik} \big),                      (3.6)

subject to

    \sum_k \delta_{ik} = t_i.                                                                   (3.7)

Here, π_k is the sensitivity of the objective function with respect to the level of resource at time k, M_k.
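As stated in the introduction, the high level (dual) problem D is solved via a subgradient method. The sketch below is a minimal illustration of one such scheme under assumed data and a fixed step size (the paper does not give its step-size rule): for the current multipliers π, each subproblem R_i is solved by enumerating contiguous start slots, and π is then updated along the constraint violation Σ_i δ_ik - M_k and projected onto π ≥ 0.

    import math

    # Hypothetical data: a single machine (M_k = 1) so that the multipliers become active.
    D = {1: 3, 2: 5, 3: 6}
    t = {1: 2, 2: 3, 3: 2}
    w = {1: 1.0, 2: 2.0, 3: 0.5}
    T = 8
    M = {k: 1 for k in range(1, T + 1)}

    def f(C_i, D_i):
        # Penalty exp(C_i - D_i - 1) u(C_i - D_i - 1) from eq. (2.2).
        return math.exp(C_i - D_i - 1) if C_i - D_i - 1 >= 0 else 0.0

    def best_start(i, pi):
        # Subproblem R_i (3.6)-(3.7): enumerate start slots l and keep the cheapest block.
        def cost(l):
            return sum(pi[k] for k in range(l, l + t[i])) + w[i] * f(l + t[i] - 1, D[i])
        return min(range(1, T - t[i] + 2), key=cost)

    def subgradient_step(pi, alpha):
        # Solve every R_i for the given multipliers, then move pi along the subgradient
        # g_k = sum_i delta_ik - M_k of the dual function, projected onto pi >= 0.
        occupancy = {k: 0 for k in range(1, T + 1)}
        for i in t:
            l = best_start(i, pi)
            for k in range(l, l + t[i]):
                occupancy[k] += 1
        return {k: max(0.0, pi[k] + alpha * (occupancy[k] - M[k])) for k in pi}

    pi = {k: 0.0 for k in range(1, T + 1)}
    for _ in range(20):                      # fixed step size, for illustration only
        pi = subgradient_step(pi, alpha=0.2)
    print({k: round(v, 2) for k, v in pi.items()})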

3.2 The Optimal Dual Solution

The above derivation presents a framework for solving the scheduling problem. There are several steps to obtaining an optimal solution: solving the subproblems, solving the dual problem, constructing a feasible solution, and finding an optimal solution. We shall briefly discuss each of them.

For subproblem R_i, if job i begins on a resource at time slot l and ends at time l + t_i - 1, then δ_ik = 1 for k = l, l+1, ..., l + t_i - 1, and zero for all other k. Also, ω_i,l+t_i-1 = 1, and ω_ik = 0 for all other k. Problem R_i is thus reduced to

    \min_{1 \le l \le T - t_i + 1} \Big( w_i f(l + t_i - 1, D_i) + \sum_{k=l}^{l + t_i - 1} \pi_k \Big).     (3.8)

To solve subproblem (3.8), the value of \sum_{k=l}^{l+t_i-1} \pi_k for each l is computed; when l + t_i - 1 > D_i, the weighted penalty term w_i f(l + t_i - 1, D_i) is added. The l that yields the smallest sum is the solution to (3.8). Mathematically, define

    S_{il} \equiv \sum_{k=l}^{l+t_i-1} \pi_k + w_i f(l + t_i - 1, D_i)                          (3.9)

as the cost of starting job i at time slot l. Then

    l^* = \arg \min_{1 \le l \le T - t_i + 1} S_{il}.                                           (3.10)
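For concreteness, the sketch below tabulates S_il of eq. (3.9) for every feasible start slot and picks l* as in eq. (3.10); the processing times, due dates, weights and multipliers π_k are hypothetical values, not taken from the paper.

    import math

    # Hypothetical processing times, due dates, weights and multipliers pi_k.
    T = 8
    t = {1: 2, 2: 3, 3: 2}
    D = {1: 3, 2: 5, 3: 6}
    w = {1: 1.0, 2: 2.0, 3: 0.5}
    pi = {1: 0.4, 2: 0.4, 3: 0.1, 4: 0.0, 5: 0.0, 6: 0.2, 7: 0.2, 8: 0.2}

    def f(C_i, D_i):
        # Tardiness penalty exp(C_i - D_i - 1) u(C_i - D_i - 1) from eq. (2.2).
        return math.exp(C_i - D_i - 1) if C_i - D_i - 1 >= 0 else 0.0

    def S(i, l):
        # S_il of eq. (3.9): multiplier cost of slots l..l+t_i-1 plus the weighted penalty.
        return sum(pi[k] for k in range(l, l + t[i])) + w[i] * f(l + t[i] - 1, D[i])

    for i in t:
        costs = {l: round(S(i, l), 3) for l in range(1, T - t[i] + 2)}   # 1 <= l <= T - t_i + 1
        l_star = min(costs, key=costs.get)                               # eq. (3.10)
        print("job", i, "costs S_il:", costs, "-> l* =", l_star)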

Recall that job i is scheduled to start at time slot l* because S_il* is the smallest cost over all l. If multiple solutions exist, there may be several time slots l such that S_il is equal to S_il*.

For an ε > 0, define

    L_i(\epsilon) \equiv \{ l : S_{il} \le (1 + \epsilon) S_{il^*} \}.                          (3.11)

Let J_i(ε) be the number of elements in the set L_i(ε). J_i(ε) signifies the degree of freedom in scheduling job i (with relative cost difference bounded by ε). If J_i(ε) = 1, then there is a unique time slot in which to schedule job i. On the other hand, if J_i(ε) > 1, job i can be scheduled at several time slots with relative cost difference bounded by ε. We thus have some flexibility in selecting a time slot l ∈ L_i(ε) to generate a good feasible schedule. Note that as ε decreases, J_i(ε) decreases, and l* ∈ L_i(ε) for any ε ≥ 0.

The heuristic procedure begins with a relatively large ε and schedules all jobs with J_i(ε) = 1. After these jobs have been scheduled, resource constraints at some time slots may have been reached. Remove those time slots from all sets L_i(ε), update J_i(ε), and schedule any jobs with J_i(ε) = 1. If no feasibility problems arise (i.e., constraint (2.4) is satisfied), decrease ε, schedule any new jobs emerging with J_i(ε) = 1, and repeat the process. If at some point feasibility issues arise, a backtracking step is then taken.
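The sketch below outlines the heuristic just described, as one possible reading of it rather than the authors' implementation: the cost tables S_il, the capacity profile, and the sequence of ε values are hypothetical, and the backtracking step is only indicated by a comment. In the paper, the S_il values would come from the multipliers of the (near-)optimal dual solution.

    # Degree-of-freedom heuristic (simplified sketch; backtracking omitted).
    # Hypothetical inputs: cost tables S[i][l] as in eq. (3.9), processing times t_i,
    # horizon T, capacity M_k, and a decreasing schedule of epsilon values.
    T = 8
    t = {1: 2, 2: 3, 3: 2}
    M = {k: 2 for k in range(1, T + 1)}
    S = {
        1: {1: 0.8, 2: 0.5, 3: 1.2, 4: 3.0, 5: 6.5, 6: 15.0, 7: 38.0},
        2: {1: 0.9, 2: 0.5, 3: 0.4, 4: 1.6, 5: 4.1, 6: 10.2},
        3: {1: 0.8, 2: 0.5, 3: 0.4, 4: 0.3, 5: 0.1, 6: 0.4, 7: 1.3},
    }

    def L(i, eps, blocked):
        # L_i(eps) of eq. (3.11), excluding start slots whose block would use a full time slot.
        s_star = min(S[i].values())
        return [l for l, s in S[i].items()
                if s <= (1 + eps) * s_star
                and not any(k in blocked for k in range(l, l + t[i]))]

    def place(i, l, schedule, load, blocked):
        # Commit job i to start slot l and mark any time slot that reaches capacity as blocked.
        schedule[i] = l
        for k in range(l, l + t[i]):
            load[k] += 1
            if load[k] >= M[k]:
                blocked.add(k)

    schedule, load, blocked = {}, {k: 0 for k in range(1, T + 1)}, set()
    for eps in (0.5, 0.2, 0.1, 0.05, 0.0):           # decreasing epsilon (values assumed)
        changed = True
        while changed:                                # re-check after each removal of full slots
            changed = False
            for i in S:
                if i in schedule:
                    continue
                slots = L(i, eps, blocked)
                if len(slots) == 1:                   # J_i(eps) = 1: a unique slot, schedule it now
                    place(i, slots[0], schedule, load, blocked)
                    changed = True
                # len(slots) == 0 would call for the backtracking step (omitted here)
    for i in S:                                       # any job left over: cheapest remaining slot
        if i not in schedule:
            slots = L(i, float("inf"), blocked)
            if slots:
                place(i, min(slots, key=lambda l: S[i][l]), schedule, load, blocked)
    print(schedule)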