Scheduling under Uncertainty

Frank Werner
Otto-von-Guericke-University, Faculty of Mathematics, PSF 4120, 39016 Magdeburg, Germany
frank.werner@ovgu.de

Keywords: uncertainty, interval processing times, stochastic approach, robust approach, fuzzy approach, stability approach, shop scheduling problems, single machine problems

1 Introduction

The usual assumption that the processing times of all operations of a scheduling problem (or other data such as setup times, release dates or due dates) are exactly known before scheduling limits the practical applicability of modern scheduling theory, since it is often not valid for real-world processes. Instead, these data may be given as random variables, as fuzzy numbers, or they may be uncertain, i.e., they may take any values from given intervals. For different objective functions, algorithms have been developed which are based on a stochastic approach, a fuzzy approach, a robust approach or a stability approach (a stability analysis of an optimal schedule with respect to variations of the input data). A stochastic method (Pinedo 2002) or a fuzzy method (Slowinski and Hapke 1999) is used when the processing times can be modeled as random variables or as fuzzy numbers, respectively. If these times can be defined neither as random variables with known probability distributions nor as fuzzy numbers, other methods are needed to solve a scheduling problem under uncertainty (Daniels and Kouvelis 1995, Sabuncuoglu and Goren 2009, Sotskov et al. 2010b).

In an uncertain model, it is assumed that in the realization of a schedule a processing time (or another numerical parameter) may take any real value between the lower and upper bounds given before scheduling. Obviously, a deterministic model is a special case of such a model, namely when for every processing time the given lower and upper bounds coincide. The uncertain model can also be interpreted as a stochastic one under strict uncertainty, when there is not sufficient a priori information about the probability distributions of the random processing times. A minimal illustration of this interval model is sketched at the end of this section.

If the input data are uncertain, the whole scheduling problem may be decomposed into two (or more) sequential scheduling problems (phases). At the off-line phase, a set of potentially optimal schedules has to be constructed under the conditions of uncertainty of the given numerical input data; it is assumed that only lower and upper bounds for the activity durations are known at this first phase. For solving a scheduling problem with uncertain input data, one can use, e.g., a stability analysis of an optimal schedule with respect to the possible variations of the given parameters. Since the “price” of time is not high at the off-line phase of scheduling, a decision-maker can use even a time-consuming algorithm for solving such a problem with uncertain numerical data. Once some activities have been realized, the decision-maker has more reliable information, which may be used to find an optimal schedule. So, at the on-line phase of scheduling, it is necessary to choose a schedule (from the set of potentially optimal schedules constructed at the off-line phase) which has to be realized in an optimal way. To solve a scheduling problem at the on-line phase, a decision-maker needs fast polynomial-time algorithms.
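
The following minimal Python sketch is an illustration of the interval model only; the names IntervalData, is_deterministic and contains are hypothetical and do not stem from the cited works. It represents the scenario set T as a box of intervals [l_i, u_i] for the processing times; a deterministic instance is the special case l_i = u_i for all jobs, and checking whether a realized vector p of processing times is a scenario of T is a simple componentwise test.

    # A minimal sketch of the uncertain (interval) processing-time model.
    # Names (IntervalData, is_deterministic, contains) are illustrative only.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class IntervalData:
        # bounds[i] = (l_i, u_i): lower and upper bound of the processing time of job i
        bounds: List[Tuple[float, float]]

        def is_deterministic(self) -> bool:
            # The deterministic model is the special case l_i = u_i for every job.
            return all(l == u for l, u in self.bounds)

        def contains(self, p: List[float]) -> bool:
            # A realized vector p of processing times is a scenario of T
            # if every component lies within its given bounds.
            return all(l <= p_i <= u for p_i, (l, u) in zip(p, self.bounds))

    # Example: three jobs with interval processing times
    T = IntervalData(bounds=[(2.0, 4.0), (1.0, 1.0), (3.0, 6.0)])
    print(T.is_deterministic())         # False: jobs 0 and 2 are uncertain
    print(T.contains([3.5, 1.0, 5.0]))  # True: this realization is a scenario of T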

2 Approaches for dealing with inaccurate data

In this section, we briefly review the approaches for dealing with inaccurate data.

In a stochastic model, particular scheduling parameters (e.g., the processing times) are assumed to be random variables with known probability distributions. Two types of stochastic problems are considered, namely stochastic job problems (the processing times are random variables) and stochastic machine shop problems (the processing times are fixed, but the job completion times are random variables due to machine breakdowns or other reasons of machine non-availability). We refer to the second part of Pinedo (2002) for results on stochastic problems.

In a fuzzy approach, the job processing times or other numerical data are given as fuzzy numbers. Fuzzy scheduling techniques either directly fuzzify existing scheduling rules or solve mathematical programming problems to determine optimal schedules. Necessary optimality conditions have been given for fuzzy counterparts of classical scheduling rules such as SPT or EDD. The volume by Slowinski and Hapke (1999) collects the main works in this field up to that time.

The following two approaches consider uncertain (interval) processing times.

The robust method (Daniels and Kouvelis 1995, Kasperski 2005, Kasperski et al. 2012) assumes that the decision-maker prefers a schedule hedging against the worst-case scenario. In a robust approach, a solution is chosen according to a particular robustness criterion, e.g., the min-max criterion or the min-max regret criterion. In the first case, one determines a solution minimizing the largest cost over all scenarios of the set T. The min-max regret criterion is less conservative: one determines a solution minimizing the largest deviation from the optimal objective function value over all scenarios of T. This criterion has been applied to several scheduling problems within the last decade, e.g., to problems whose deterministic versions are polynomially solvable. Kasperski (2005) gives a polynomial algorithm of complexity O(n^4) for the problem 1|prec|f_max with both interval processing times and interval due dates of the n jobs of a set J when the maximal regret criterion is applied. Kasperski et al. (2012) apply the robust approach to the problem F2||C_max with interval processing times; it is shown that even the bounded case of the min-max regret problem with only two possible scenarios is strongly NP-hard. Moreover, several approximability results are given in the existing literature. A small brute-force illustration of the two robustness criteria is sketched at the end of this section.

The stability method (Lai and Sotskov 1999, Lai et al. 1997, Sotskov et al. 2009, Sotskov et al. 2010a, Sotskov et al. 2010b, Sotskov and Lai 2012) combines a stability analysis, a multi-stage decision framework and the solution concept of a minimal dominant set of semi-active schedules. This approach is briefly sketched in the next section.
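
The following brute-force Python sketch makes the min-max and min-max regret criteria explicit for a tiny single machine problem 1||sum C_j with a finite set of scenarios. The data are hypothetical, and the enumeration of all permutations is for illustration only; it is not the polynomial algorithm of Kasperski (2005) or the methods of Kasperski et al. (2012).

    # Brute-force illustration of the min-max and min-max regret criteria for a
    # tiny single machine problem 1||sum C_j with a finite set of scenarios.
    from itertools import permutations

    # scenarios[s][j] = processing time of job j under scenario s (hypothetical data)
    scenarios = [
        [2.0, 5.0, 3.0],
        [4.0, 1.0, 3.0],
    ]
    jobs = range(len(scenarios[0]))

    def total_completion_time(seq, p):
        t, total = 0.0, 0.0
        for j in seq:
            t += p[j]        # completion time of job j
            total += t
        return total

    # Optimal value per scenario (needed for the regret); for 1||sum C_j each
    # deterministic scenario could also be solved by the SPT rule instead of
    # full enumeration.
    opt = [min(total_completion_time(seq, p) for seq in permutations(jobs))
           for p in scenarios]

    # Min-max: minimize the largest cost over all scenarios.
    best_minmax = min(permutations(jobs),
                      key=lambda seq: max(total_completion_time(seq, p)
                                          for p in scenarios))
    # Min-max regret: minimize the largest deviation from the scenario optimum.
    best_regret = min(permutations(jobs),
                      key=lambda seq: max(total_completion_time(seq, p) - opt[s]
                                          for s, p in enumerate(scenarios)))
    print("min-max schedule:       ", best_minmax)
    print("min-max regret schedule:", best_regret)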

3 The stability approach

Sotskov et al. (2010b) discuss the application of this approach to the job (and general) shop problem with the makespan and the mean flow time criteria. The strategy of this approach consists in separating the structural input data from the numerical input data. The structural input data (precedence and capacity constraints) are represented by a mixed graph G, which completely defines the set of semi-active schedules. The approach is based on an improved stability analysis of an optimal digraph. A set of digraphs Λ*(G) is called a G-solution for the job shop scheduling problem with interval processing times if, for each vector p of the scenario set T, the set Λ*(G) contains at least one optimal digraph. If no proper subset of the set Λ*(G) is a solution of the scheduling problem, Λ*(G) is called a minimal G-solution. The analysis is based on the notion of the relative stability radius, where the relativity is defined both with respect to the polytope T and to a set B of digraphs. Then an algorithm SOL for determining a G-solution and an algorithm MINSOL for determining a minimal G-solution (both for the makespan and the mean flow time criteria) are presented, which are based on a repeated calculation of relative stability radii. Using the introduced dominance relation between digraphs, a characterization of a G-solution for the job shop problem is derived by presenting necessary and sufficient conditions for a set of digraphs to be a G-solution.

Algorithm SOL generates a covering of the polytope T by nested closed balls with the common center p ∈ T and different radii, which are the relative stability radii of the same digraph but for different nested sets. Algorithm MINSOL is based on the repeated application of algorithm SOL for excluding redundant digraphs. Then three different algorithms are given which generate the set B of digraphs used as input for algorithms SOL and MINSOL; two of them are of the branch-and-bound type. For the job shop problem with minimizing the makespan and the mean flow time, respectively, computational results for determining an exact G-solution are presented for instances with up to 24 operations. In addition, for instances with up to 50 operations and different degrees of uncertainty, heuristic G-solutions are constructed.

The stability approach was also applied to several scheduling problems whose deterministic versions allow a polynomial-time solution. In Sotskov et al. (2010b), additional results for the two-machine flow (and job) shop problem with interval processing times are presented. Along with the off-line scheduling problem, also the on-line problem is considered. It is shown how the additional information available at the on-line phase can be used to obtain a better solution than that constructed for the off-line version of the problem. All the conditions proven can be tested in polynomial time. Computational results both for the off-line and the on-line phase are given for instances with up to 1000 jobs.

The stability approach was also implemented, e.g., for the single machine mean flow time problem with interval processing times (Sotskov et al. 2010a, Sotskov et al. 2011, Sotskov and Lai 2012). In particular, the stability box within a set of scenarios T was defined for a permutation π of the n jobs. For any scheduling instance, the stability box is a subset of the stability region, but it is easier to compute. In Sotskov and Lai (2012), an O(n log n) algorithm is developed for calculating the stability box of a fixed permutation, and an O(n^2) algorithm is developed for selecting a permutation from the set S of feasible sequences with the largest dimension and volume of its stability box. The efficiency of these algorithms is demonstrated on a set of randomly generated instances with 10 ≤ n ≤ 1000. In Sotskov et al. (2011), the complexity of finding such a permutation is reduced to O(n log n).

All these algorithms use the precedence-dominance relation on the set of jobs J and the solution concept of a minimal dominant set S(T) ⊆ S: job J_u dominates job J_v if there exists a minimal dominant set S(T) for the problem such that job J_u precedes job J_v in every permutation of the set S(T). One can prove a necessary and sufficient condition for a job J_u to dominate a job J_v. Using this condition, one can obtain a compact representation of the set S(T) as a digraph (J, A): by checking the dominance condition for each pair of jobs from J, the dominance digraph (J, A) can be constructed in O(n^2) time, where (J_u, J_v) ∈ A if and only if J_u dominates J_v. Useful properties of the dominance digraph for the two-machine flow-shop makespan problem are presented in Matsveichuk et al. (2011).
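
As an illustration of the precedence-dominance relation, the following Python sketch builds a dominance digraph (J, A) by an O(n^2) pairwise check for a small single machine instance with interval processing times and weights. The data and the dominance test used here (the worst weighted ratio of job u does not exceed the best weighted ratio of job v, a WSPT-type condition in the spirit of the cited works) are assumptions for illustration; the exact necessary and sufficient dominance condition is given in Sotskov et al. (2010a) and Sotskov and Lai (2012).

    # Sketch: building a dominance digraph (J, A) by checking all ordered pairs
    # of jobs (O(n^2) tests) for a single machine problem with interval
    # processing times and weights. The dominance test below is a simple
    # WSPT-type condition used here for illustration only.

    # Hypothetical data: job -> (lower bound, upper bound, weight)
    jobs = {
        1: (2.0, 3.0, 2.0),
        2: (4.0, 9.0, 1.0),
        3: (1.0, 8.0, 1.0),
    }

    def dominates(u, v):
        l_u, u_u, w_u = jobs[u]
        l_v, u_v, w_v = jobs[v]
        # Worst weighted ratio of u does not exceed the best weighted ratio of v,
        # so u may safely precede v under every scenario of T.
        return u_u / w_u <= l_v / w_v

    # Arc set A of the dominance digraph: (u, v) in A iff job u dominates job v.
    A = [(u, v) for u in jobs for v in jobs if u != v and dominates(u, v)]
    print(A)  # here: [(1, 2)] -- job 1 dominates job 2; job 3 is incomparable with both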

4 Concluding remarks

In spite of the large number of papers and books about scheduling, the utilization of results from scheduling theory in most production environments is still far from the desired level. One of the reasons for this gap between scheduling theory and practice is the usual assumption that the processing times of the jobs are known exactly before scheduling. One has to recognize that we are only at the beginning of the road towards mathematical tools for dealing with scheduling problems with uncertain parameters. It is obvious that there is no single method suitable for all the different types of uncertainty arising in the real world. Various measures of uncertainty, which support the decision-maker in selecting an appropriate method, are introduced in a recent paper (Sotskov et al. 2012).


For the stability approach, it is a challenging task to develop more efficient algorithms, particularly for those uncertain problems whose deterministic counterparts allow polynomial-time algorithms. The application of the stability analysis within an implicit enumeration framework may also be a subject of further research. Currently, most results are available for the makespan and total flow time criteria, and thus the investigation of other criteria is of interest as well. In addition, for the on-line problem, only very preliminary results are available.

Acknowledgements

The author is grateful to all of his collaborators in the area of scheduling under uncertainty, particularly to the co-authors Prof. Yuri Sotskov, Dr. Nadezhda Sotskova and Prof. Tsung-Chyan Lai of the book with the same title for the long-term cooperation in this field.

References

Daniels, R. and P. Kouvelis, 1995, “Robust scheduling to hedge against processing time uncertainty in single stage production", Management Science, Vol. 41 (2), pp. 363-376.
Kasperski, A., 2005, “Minimizing maximal regret in the single machine sequencing problem with maximum lateness criterion", Operations Research Letters, Vol. 33, pp. 431-436.
Kasperski, A., A. Kurpisz and P. Zielinski, 2012, “Approximating a two-machine flow shop scheduling under discrete scenario uncertainty", European Journal of Operational Research, Vol. 217, pp. 36-43.
Lai, T.-C. and Y. Sotskov, 1999, “Sequencing with uncertain numerical data for makespan minimization", Journal of the Operational Research Society, Vol. 50, pp. 230-243.
Lai, T.-C., Y. Sotskov, N. Sotskova and F. Werner, 1997, “Optimal makespan scheduling with given bounds of processing times", Mathematical and Computer Modelling, Vol. 26 (3), pp. 67-86.
Matsveichuk, N., Y. Sotskov and F. Werner, 2011, “The dominance digraph as a solution to the two-machine flow-shop problem with interval processing times", Optimization, Vol. 60 (12), pp. 1493-1517.
Pinedo, M., 2002, “Scheduling: Theory, Algorithms, and Systems", Prentice-Hall, Englewood Cliffs, NJ, USA.
Sabuncuoglu, I. and S. Goren, 2009, “Hedging production schedules against uncertainty in manufacturing environment with a review of robustness and stability research", International Journal of Computer Integrated Manufacturing, Vol. 22 (2), pp. 138-157.
Slowinski, R. and M. Hapke, 1999, “Scheduling under Fuzziness", Physica-Verlag, Heidelberg, Germany; New York, USA.
Sotskov, Y., N. Egorova and T.-C. Lai, 2009, “Minimizing total weighted flow time of a set of jobs with interval processing times", Mathematical and Computer Modelling, Vol. 50, pp. 556-573.
Sotskov, Y., N. Egorova and F. Werner, 2010a, “Minimizing total weighted completion time with uncertain data: A stability approach", Automation and Remote Control, Vol. 71 (10), pp. 2038-2057.
Sotskov, Y., N. Egorova, T.-C. Lai and F. Werner, 2011, “The stability box in interval data for minimizing the sum of weighted completion times", in: Proceedings of SIMULTECH 2011, Noordwijkerhout, The Netherlands, pp. 14-23.
Sotskov, Y. and T.-C. Lai, 2012, “Minimizing total weighted flow time under uncertainty using dominance and a stability box", Computers & Operations Research, Vol. 39, pp. 1271-1289.
Sotskov, Y., N. Sotskova, T.-C. Lai and F. Werner, 2010b, “Scheduling under Uncertainty. Theory and Algorithms", Belorusskaya nauka, Minsk, Belarus.
Sotskov, Y., A. Wagelmans and F. Werner, 1998, “On the calculation of the stability radius of an optimal or an approximate schedule", Annals of Operations Research, Vol. 83, pp. 213-252.
Sotskov, Y., T.-C. Lai and F. Werner, 2012, “Measures of problem uncertainty of scheduling problems with interval processing times", Manuscript, OvGU Magdeburg (under submission).