Job Shop Scheduling: Hybrid Intelligent Human-Computer Paradigm

by Peter Gerard Higgins

March 1999

Submitted in total fulfilment of the requirements of the degree of Doctor of Philosophy

Department of Mechanical and Manufacturing Engineering
The University of Melbourne

Abstract

The thesis challenges the general perception that job shop scheduling is a combinatorial exercise of allocating jobs to machines and sequencing operations on each machine. In practice, human schedulers produce functional schedules bound by many constraints, both qualitative and subjective, and quantitative and objective. A paradigm for hybrid intelligent human-computer scheduling is developed through Cognitive Work Analysis, based on a field study of a scheduler in a small job shop. Hybrid Human-Computer Intelligent Automation (HHCIA), in which human and machine ‘intelligence’ are combined, is related to the established models for supervisory control of continuous processes. ‘Hybrid’ is used to signify human and machine intelligence combining cooperatively. HHCIA applied to both continuous and discrete manufacture is shown to consist of two elements: the control of the processing of batches of work and the control of the changeover of batches. Scheduling is shown to be central to the ‘hybrid’ automation of discrete manufacture. Planning the changeover of batches has a predominant place in HHCIA of discrete manufacture. As the job-shop environment is perceived to be the most difficult to schedule, it is used to explore a hybrid intelligent human-computer paradigm for the scheduling component of HHCIA. This leads to the development of a methodology for designing a hybrid intelligent production scheduling system.


Declaration

This is to certify that
(i) the thesis comprises only my original work,
(ii) due acknowledgment has been made in the text to all other material used,
(iii) the thesis is less than 100,000 words in length, exclusive of tables, maps, bibliographies, appendices and footnotes.

Peter Gerard Higgins


Acknowledgments

This thesis is dedicated to my wife Feodora Fomin and children Lucy and Gemma. The research would not have been possible without their love, patience and sacrifice.

I wish to thank my supervisors, Dr. Andrew Wirth and Prof. Malcolm Good, and my colleagues at SCHIL (Swinburne Computer-Human Interaction Laboratory). Malcolm encouraged me to follow my own interests instead of following well-worn paths. Andrew provided dedicated and persistent support over the long haul and ensured that intellectual rigour was maintained. My colleagues at SCHIL provided the necessary environment for my understanding of Cognitive Work Analysis to mature. In particular, I wish to thank Peter J. Benda of SCHIL for his scrupulous proofreading and suggested amendments to bring clarity to the text.


Table of Contents

List of Figures
List of Tables

Chapter 1 Introduction
1.1 Control of Manufacturing Systems
1.2 Research Question
1.3 Thesis Overview

Chapter 2 Hybrid Human-Computer Intelligent Automation of Discrete Manufacture
2.1 Supervisory control
2.1.1 Two Distinct Forms of Control
2.2 Comparison of Continuous and Discrete Manufacture
2.3 Hybrid Intelligence
2.3.1 The Association with Supervisory Control
2.3.2 Hybrid Intelligence in Planning
2.4 Hybrid Intelligent Scheduling
2.5 Summary

Chapter 3 The Potential for Hybrid Intelligent Production Scheduling
3.1 Constraints Define Manufacture
3.2 Production Process as Constraints
3.3 Constraint Relaxation
3.4 Heavy Relaxation
3.4.1 Performance Objectives
3.4.2 Finding the Most Appropriate Schedule
3.4.3 Heuristics
3.4.4 Non-Optimal Schedules
3.4.5 Problem Simplification
3.4.6 Time Simplification
3.4.7 Process Simplification
3.4.8 Resources Simplification
3.5 Light Relaxation
3.5.1 Artificial Intelligence
3.6 Perplexity
3.6.1 Deep Knowledge and Constraint Relaxation
3.6.2 Uncertainty and Robustness
3.6.3 Perplexity and the Need for a Paradigm Shift
3.7 Human
3.7.1 Difficulties Experienced by Human Schedulers
3.8 Actively Engaged Human using Computer-Based Tools
3.9 Summary

Chapter 4 Human-Computer Interaction in Production Scheduling
4.1 Humans and the system
4.1.1 Actions
4.1.2 Modelling Purposeful Action
4.2 Cognitive Work Analysis
4.2.1 Work Domain Analysis
4.2.2 Activity Analysis
4.2.3 Human Supervisory Control in Continuous Process Control
4.2.4 Model Human Scheduler
4.2.5 Multiple Mappings in Recognitional Decisions
4.3 The MHS and Hybrid Human-Computer Decision Making
4.3.1 Information through Patterns in Data
4.3.2 Representation in the Mind
4.3.3 Mental Representation and the Environment
4.4 Supporting Knowledge-Based Behaviour
4.5 Mapping Surface Features to Situational Constraints
4.5.1 Application of Mapping to Supervisory Control of Continuous Processes
4.6 Human-Computer Interface for an HIPSS
4.7 Summary

Chapter 5 Field Study
5.1 Methodology
5.2 Description of the Printing Environment
5.3 Characterisation of the Scheduling Problem
5.3.1 Machine Loading Board
5.3.2 Use of the Board for Scheduling
5.4 The Goals Neil Seeks and the Rules he Applies
5.5 Intensive Study of Scheduling
5.5.1 Comparing the Schedule with the Production Record
5.6 Summary of Findings from the Field Study
5.7 Benchmarking Performance under Perplexity
5.8 Drawing the Threads of the Discussion Together
5.9 Conclusion

Chapter 6 Cognitive Work Analysis of Field Data
6.1 Domain Characteristics
6.2 Cognitive Work Analysis Applied to Discrete Manufacture
6.2.1 CWA Applied to the Field Study
6.2.2 Dynamics of Goal Setting and Attainment
6.3 Summary

Chapter 7 Hybrid Human-Computer Intelligent Scheduling
7.1 Interactive Scheduling
7.2 Drawing the Threads of the Discussion Together
7.3 Hybrid Intelligent Production Scheduling System
7.3.1 Features in the ProtoHIPSS
7.3.2 Relating the ProtoHIPSS to CWA
7.3.3 Improving Performance
7.4 The Contribution of HIPSS to Scheduling Practice
7.4.1 Centrality
7.4.2 Cognitive Work Analysis
7.4.3 Signification and the Display of Objects
7.4.4 Maintaining and extending expertise
7.5 Limitations that need addressing
7.6 Summary

Chapter 8 Conclusions, Reflections and Future Work

Bibliography

List of Figures

Figure 1. The primary loop in process regulation.
Figure 2. The set point for process 2 is influenced by the output of other processes and human input.
Figure 3. Strictest sense of supervisory control (Sheridan, 1987).
Figure 4. Broader sense of supervisory control (Sheridan, 1987).
Figure 5. Functional and temporal nesting of supervisory roles (Sheridan, 1987).
Figure 6. Multi-loop model of supervisory control (Sheridan, 1987).
Figure 7. Human interactive subsystem (HIS).
Figure 8. Human-Computer Interactive Modes (Nakamura and Salvendy, 1994).
Figure 9. Constraints define the scheduling process.
Figure 10. Entity-Relationship for multiple operations.
Figure 11. Relations represented by classes: Classes Operation and Machine represent operation and machine entities and class OperationMachine depicts the relation IS_PROCESSED_BY.
Figure 12. When a job is allocated to a machine, an instance of OperationMachine is created, showing which machine is to process it and at what time.
Figure 13. Constraints from other OperationMachine instances on Operation3Machine1.
Figure 14. The immediate predecessor to Operation3Machine1 establishes a constraining relation on Operation3.
Figure 15. Comparison of the flow of jobs for flow and job shops.
Figure 16. Relationship between constraints and feasibility.
Figure 17. The path of heavy relaxation.
Figure 18. The OR model simplifies the real scheduling problem to one machine.
Figure 19. Single-machine model for parallel machines.
Figure 20. The path of light relaxation.
Figure 21. Multi-loop model of supervisory control (Sheridan, 1987).
Figure 22. Human interactive subsystem (HIS).
Figure 23. Means-Ends Abstraction Hierarchy: Functional properties of a physical system designed to serve human purposes, described at several levels of abstraction (Rasmussen and Pejtersen, 1995).
Figure 24. State variables associated with a heat exchanger at different levels of abstraction.
Figure 25. Schematic map of the sequence of information processes involved in a control decision (Rasmussen and Goodstein, 1986).
Figure 26. The decision-making activity of the supervisory controller in monitoring the heat exchanger.
Figure 27. Schematic representation of the operation of power steering in a car.
Figure 28. Model Human Scheduler (Sanderson, 1991).
Figure 29. Computer support for rule-based decisions.
Figure 30. Facial features recognisable in the context of the whole face (based on Solso, 1979).
Figure 31. Parallel machine, makespan problem displayed as a multiple bin, minimum height problem.
Figure 32. Organisation of a composite symbolic display for an industrial plant (Rasmussen, Pejtersen and Goodstein, 1994, adapted from Lindsay and Staffon, 1988).
Figure 33. The coolant circulation in the primary and secondary circuits.
Figure 34. The propagation of a disturbance through a thermodynamic system (based on Rasmussen, Pejtersen and Goodstein, 1994).
Figure 35. Constraints on the thermodynamic cycle.
Figure 36. Original configuration of machines.
Figure 37. The configuration of the machines during the intensive field study.
Figure 38. Major physical functions in printing.
Figure 39. Major physical resources in printing.
Figure 40. Ends-Means relationships between physical functions and physical resources.
Figure 41. Statistical profile of resources used in processing the jobs listed in the database.
Figure 42. Job Tag.
Figure 43. Estimated versus actual manufacturing times.
Figure 44. The scheduling goal structure.
Figure 45. Operational steps to realise goal 2A, “The minimisation of press set-up time.”
Figure 46. The goal structure including the subgoals of 2A.
Figure 47. The relevant goals for each step in the scheduling process for Akira 1.
Figure 48. The relevant goals for each step in the scheduling process for Akira 2.
Figure 49. The relevant goals for each step in the scheduling process for Akira 3.
Figure 50. The relevant goals for each step in the scheduling process for Akira 4.
Figure 51. Actual production 28/5/92 (Thursday).
Figure 52. Actual production 29/5/92 (Friday).
Figure 53. Actual production 30/5/92 (Saturday).
Figure 54. Actual production 31/5/92 (Sunday).
Figure 55. Actual production 1/6/92 (Monday).
Figure 56. Actual production 2/6/92 (Tuesday).
Figure 57. Constraints define the scheduling process.
Figure 58. Manufacturing times of jobs processed during the field study.
Figure 59. Distribution of manufacturing times.
Figure 60. Formulae for determining length of the production day and due date in simulated time and conversion of a job’s actual completion time to simulated time.
Figure 61. The arrival and due times of jobs processed during the intensive study.
Figure 62. The distribution of slack time on arrival.
Figure 63. Ends-Means relationships between physical functions and physical resources.
Figure 64. State variables associated with a heat exchanger at different levels of abstraction.
Figure 65. Kinsley’s (1994) Abstraction Hierarchy for an Advanced Manufacturing System.
Figure 66. Work Domain Analysis for Scheduling the Akira presses at Melamed.
Figure 67. Feasible means-ends links for a particular job specification.
Figure 68. Structural sequence of Abstraction Hierarchies.
Figure 69. The means-ends chain that is instantiated is shown by the links at a composite level of granularity.
Figure 70. Model Human Scheduler (Sanderson, 1991).
Figure 71. The scheduling goal structure.
Figure 72. Activity as action sequence related to the work domain (shown in part at top) and activity in decision terms (shown as two decision ladders at bottom). Diagram shows a case where a function is effected by activity that is shared between human and system (Sanderson, 1998).
Figure 73. A Decision Ladder for landing a light aircraft, showing the prototypical activity associated with pre-landing checks (left) and maintenance of the descent profile (right) (Lintern and Naikar, 1998).
Figure 74. Scheduling activity as a series of decision ladders.
Figure 75. The relationship between the goal structure, decision ladder and abstraction hierarchy.
Figure 76. The cyclic unit (Volpert, 1982).
Figure 77. The hierarchical-sequential organisation (Volpert, 1982).
Figure 78. The primary cyclic-unit for minimising the number of set ups on each press.
Figure 79. The hierarchical-sequential organisation of cyclic units for minimising the number of set ups on each press. The shaded triangles show the cyclic units that have formed when the scheduler’s focus is on Akira 1.
Figure 80. The two types of inputs in the interactive scheduling task (Bauer et al., 1991).
Figure 81. Human-Computer Interactive Modes (Nakamura and Salvendy, 1994).
Figure 82. Architecture of Human Performance Model (Nakamura and Salvendy, 1994).
Figure 83. Interface elements and their location in a hybrid intelligent production scheduling system (HIPSS).
Figure 84. The relationship between the goal structure, decision ladder and abstraction hierarchy.
Figure 85. Overview of the Shop during schedule construction showing the Job Windows for the four presses and the unallocated jobs.
Figure 86. Jobs Window for Akira 3: Jobs and their characteristics. A partial display of objects representing job attributes.
Figure 87. Job Details observable in the Job Windows.
Figure 88. Global and Local Features.
Figure 89. As a string of jobs is collected, the collected objects become shaded (labelled A) and the permissible presses become shaded (labelled C). The number of spokes on the blue ‘collector’ (labelled B) shows how many have been collected. When the ‘collector’ is right clicked, the job numbers are displayed in a pop-up box, listed in the sequential order of collection.
Figure 90. Feasible means-ends links for a particular job specification.
Figure 91. Structural sequence of abstraction hierarchies.
Figure 92. The relationship between the abstraction hierarchy and the signs in the Jobs Windows of the ProtoHIPSS.
Figure 93. Relations represented by classes: Classes Operation and Machine represent operation and machine entities and class OperationMachine depicts the relation IS_PROCESSED_BY.
Figure 94. Set-up time is a factor that arises when the value of particular constraints differs between abstraction hierarchies.
Figure 95. The scheduling goal structure.
Figure 96. Computer support for rule-based decisions.
Figure 97. Human scheduler recognising relevant policy and computer carries out the procedure.
Figure 98. Schedulers’ practices commonly address operational objectives. An HIPSS should extend their interest to functional goals.
Figure 99. Display of the contribution from each job to the performance measure.
Figure 100. Ends-Means relationships between physical functions and physical resources.
Figure 101. A conceptual representation of a JSO for a group of jobs.
Figure 102. Display of the contribution from each job to the performance measure.
Figure 103. The graphic form of the screen for the experiment.
Figure 104. The text form of the screen for the experiment.

List of Tables

Table 1. Differences in Human Supervisory Control between Computerised Discrete Part and Process Manufacturing (Barfield, Hwang and Chang, 1986).
Table 2. Summary of the basis for cognitive control. Schematic illustration of the representations of the regularities behind the behaviour of the environment that are used for control of behaviour (Rasmussen, 1990).
Table 3. Number of cylinder-sets for the Akira presses.
Table 4. Set-up times for Akira presses.
Table 5. A non-exhaustive set of production rules for the technical constraints.
Table 6. The relationship between number of colours and the number of parallel Akira presses.
Table 7. Colour of tag relates to the cylinder size.
Table 8. Speed of Akira press.
Table 9. The flowtime for producing artwork and plates for jobs in the database.
Table 10. Scheduling goals.
Table 11. Realisation of goals.
Table 12. Subgoals of goal 2A, “Minimisation of press set-up time.”
Table 13. The cylinder requirement for jobs in the database.
Table 14. The percentage of steps for which a goal applies: per press and for all presses.
Table 15. The percentage of steps for which a goal is violated: per press and for all presses.
Table 16. The status of the jobs relative to their due date for the day that the schedule was constructed (percentages are for the day).
Table 17. Code for Akira-press cylinder size.
Table 18. Available time for the web presses in minutes.
Table 19. The length of each day for the simulation is given by the total available time for each press divided by the number of presses.
Table 20. Number of tardy jobs (modified arrival times and due date at modified 4pm).
Table 21. Average tardiness (modified arrival times and due date at modified 4pm).
Table 22. Number of tardy jobs (modified arrival times and due date at end of day).
Table 23. Average tardiness (modified arrival times and due date at end of day).
Table 24. Makespan.
Table 25. Utilisation.
Table 26. Abstraction Hierarchy from top-down and bottom-up perspectives (Krosner, Mitchell and Govindaraj, 1989).
Table 27. Legend for the abbreviations for the job attributes.
Table 28. A typology of production systems (Wiers and McKay, 1996).
Table 29. Legend for the labels in Figure 86.

Chapter 1 Introduction

1.1 Control of Manufacturing Systems

Production scheduling research has been conducted within three distinct paradigms, which Solberg (1989) classifies as:

• The optimisation paradigm;
• The data-processing paradigm;
• The control paradigm.

The goal of the optimisation paradigm is to formulate and solve scheduling problems as mathematical programming problems. As problem formulation proved to be difficult, simplifying assumptions were introduced. However, the act of simplifying scheduling problems in most cases removes their essence: the control of manufacture. In the data-processing paradigm, scheduling is characterised as the management of data. The relation between decisions and performance measures is not considered, and data dependencies, redundant or contradictory information, and efficient data storage structures are not addressed. The control paradigm places attention on the monitoring and tuning aspects of systems.

This thesis challenges the perception of job shop scheduling as a combinatorial exercise. Where there is aggregate production control and in process-oriented industries, McKay and Wiers (1997) assert that scheduling theory can rightfully claim to have influenced practice and to be in tune with reality. However, they maintain that theory and practice have not been closely related in intermittent job shops or in industries where there is inherent uncertainty or where human judgement is necessary. In such environments, production scheduling is generally acknowledged to be a skilled craft practised by experienced human schedulers.

Knowledge and intuition, gained through years of first-hand experience, are the principal tools employed by the scheduler in generating and maintaining satisfactory production schedules (Rodammer and White, 1988). Scheduling is seen to be a human activity in which schedulers work with many competing and conflicting goals to produce realistic schedules that are bound by many constraints. Scheduling under these conditions is not a computational task but:

A dynamic and adaptive process of iterative decision making and problem solving, involving information acquisition from a number of sources, and with the decisions affecting a number of production facets in reaction to immediate or anticipated problems (McKay and Wiers, 1997).

The person who is responsible for developing the schedule is a member of an organisation that provides the inputs for, and requires the results from, the scheduling process (McKay and Wiers, 1997). As scheduling problems are embedded in the organisation, the way the problem is solved is affected as much by social forces as by the physical and temporal constraints on the manufacturing system. The organisational flow of command and the attendant flow of information across departmental boundaries, as discussed by McKay and Buzacott (1999), also affect the scheduling outcomes. However, tackling all these issues within the strict confines placed on this thesis would be impossible. Therefore, to delimit the effect of these problems, only small job shop environments are considered, in which all the scheduling functions, including dispatching and production control, reside in a single person. Issues relating to hierarchical structures of planners and schedulers, which may include master planners, master schedulers and departmental schedulers, are consequently not addressed in this thesis. Larger organisational issues are therefore removed, and the issues associated with shopfloor scheduling per se are the focus of the discussion.

The focus of the research is on the development of a hybrid intelligent human-computer paradigm for job shop scheduling that supports human schedulers operating in environments characterised by uncertainty and instability.


1.2 Research Question

The major question posed by this research has the following logically linked components:

1. What are the scheduling strategies employed by human schedulers in job-shop scheduling?
2. How are these influenced by the abstractions employed in the data presented to the scheduler about the state of the system and its performance requirements?
3. What are the most appropriate formalisms for developing a hybrid intelligent human-computer production scheduling system?

1.3 Thesis Overview

In Chapter 2, the foundation for the thesis topic, a hybrid intelligent human-computer paradigm for job shop scheduling, is set through a model of Hybrid Human-Computer Intelligent Automation (HHCIA), which combines human and machine ‘intelligence’ to control a manufacturing system, whether continuous or discrete. Two distinct phases of control in HHCIA are delineated: the control during the processing of a batch of work and the control of the changeover of batches. It is argued that while supervisory-control models have been well developed for process control, there is a dearth of models for the human-computer control of the changeover of batches. While both phases are present in all manufacturing environments, in discrete manufacture the planning of the changeovers is shown to be central. As the job-shop environment is perceived to be the most difficult to schedule, it is used to explore a hybrid intelligent human-computer paradigm for the scheduling component of HHCIA.

In Chapter 3, the potential for hybrid intelligent production scheduling is explored. Job shops are shown to be dynamic environments that are stable for only short periods. The intractable nature of production scheduling, the limitations and problems of algorithmic and knowledge-based solutions, and the consequential dearth of computer solutions are discussed. Scheduling activity is shown to be the management of constraints to satisfice desired outcomes. To produce feasible schedules, schedulers have to relax some constraints. Scheduling problems are then shown to have two dimensions of problem difficulty: complexity and perplexity. Combinatorial complexity arises in the classical operations research (OR) representation of scheduling problems. It is asserted that the OR approach relaxes constraints so heavily that a problem’s representation often becomes far removed from the reality of the shopfloor. In contrast, practising schedulers only lightly relax constraints, to the degree necessary to obtain feasible schedules. Schedulers, it is maintained, operating in a perplex world of uncertainty and instability, can produce schedules where there are unexpected events, incomplete and ill-defined data and multifarious factors. It is argued that a hybrid intelligent production scheduling system (HIPSS), in which there is active engagement of humans in decision making, brings together the strengths of traditional operations research, knowledge-based systems and human-centred approaches to scheduling. It is maintained that this approach accommodates the complexity and perplexity found in real manufacturing establishments.

In Chapter 4, scheduling is placed within a systems-thinking context in which human schedulers make decisions through purposeful rational action. As manufacturing systems are products of purposeful action, it is argued that they can be modelled using means-ends relationships. The construction of a means-ends hierarchy for the work domain and an activity analysis of scheduling behaviour provide a framework for developing an HIPSS. It is contended that schedulers’ perception of a manufacturing system varies across a means-ends hierarchy and is dependent on the type of reasoning they invoke. The case is presented for the reasoning of schedulers as being skill-, rule- or knowledge-based. It is asserted that scheduling activity consists of cycles of recognition and action that are interrelated through Rasmussen’s decision ladder. Scheduling activity is seen as the placement of intentional constraints on the behaviour of the manufacturing system. It is argued that an HIPSS has to support schedulers’ attunement to the constraints of the physical system at the different levels of abstraction in the means-ends hierarchy.

To effectively understand human-computer interactions, one must first understand how a human scheduler solves scheduling problems. Chapter 5 discusses how a scheduler at a small job shop, operating within the bounds of many constraints, produced schedules using a machine loading board. By directing his attention to the constraints and the immediate target goals, he is shown to have produced workable schedules. The many goals he sought and the operational policies he applied are identified. The goals are consolidated into a structure that links goals at various levels of abstraction. While the goal structure is for a particular domain and scheduler, it provides a basis for discussing the architecture of an HIPSS. An ends-means description of the physical functions and physical resources of the job shop provides a partial work domain analysis. For some standard OR performance measures, the scheduler’s performance is assessed and compared to benchmark values found from a simulation of a heavily relaxed model of the job shop. From a discussion of the type of scheduling strategies employed by the scheduler, inferences are drawn regarding the decision-making activities of human schedulers in discrete manufacture.

In Chapter 6, the work domain analysis and activity analysis tools of Cognitive Work Analysis (CWA) are applied to the data from the field study. The available CWA tools are found inadequate for representing human decision-making processes in discrete-event systems. New tools are developed to extend the current formalisms; these include the goal structure of Chapter 5, multiple decision ladders and abstraction hierarchies. The dynamic aspects of goal setting and attainment are represented using the cyclic units of Action Regulation Theory, developed by German work psychologists. These methodological tools provide the basis for developing an HIPSS that can support schedulers in perplex environments.

In Chapter 7, the form of an HIPSS is explored. The chapter begins with a discussion of interactive human-computer scheduling systems. It is argued that these systems fail as scheduling tools because the decision-making architecture on which they are constructed is flawed. An alternative architecture is proposed that locates the scheduler centrally in the decision-making process. Using the findings from the Cognitive Work Analysis of scheduling at Melamed, undertaken in Chapter 6, the way an HIPSS can be designed is then demonstrated. Finally, the methodological contribution of an HIPSS to scheduling practice is discussed.


Chapter 2 Hybrid Human-Computer Intelligent Automation of Discrete Manufacture

The purpose of this chapter is to develop an understanding of what ‘hybrid’ automation is, to locate it in the context of discrete manufacture, and to show that, for discrete manufacture, production scheduling is central to it. ‘Hybrid’ here signifies human and machine intelligence combining cooperatively (Karwowski, 1988). The continuous process industry has long-term experience in controlling automated processes. This is especially so in nuclear power generation and petrochemical production. The extensive research into human supervisory control of these automated processes may therefore provide insight into how human and machine intelligence may combine in discrete manufacture (Edwards and Lees, 1973; Sheridan and Johannsen, 1976; Umbers, 1979; Bainbridge, 1983). The exploration accordingly begins with a discussion of the features of supervisory control. The supervisory paradigm is shown to have two distinct forms of control that are critical components for categorising the significant differences between the control of continuous and discrete manufacture. This leads to a definition of hybrid human-computer intelligent automation and the centrality of production scheduling in hybrid automation of discrete manufacture.

2.1 Supervisory control

In continuous processes, there are various degrees of automation. The level of control in a primary control loop can range from fully automatic, with human intervention only when the system is not behaving correctly, to semi-automatic control relying on human activity to close the loop, as depicted in Figure 1. Many primary control loops may be present, with the degree of control varying across processes. For each loop, the set point fixes the desired value of its output. A set point’s value may be set either automatically or manually. Figure 2 shows a primary loop’s set point governed by two other processes and human intervention. Leaving human intervention aside, the set point is a weighted sum of the outputs of the two other processes.

Figure 1. The primary loop in process regulation: (a) fully automatic; (b) semi-automatic.

For situations in which a human closes a control loop, researchers have applied control-theoretic models to human behaviour (Newell and Simon, 1963; Sheridan, 1976). Their interest is in the dynamic behaviour of human operators, represented by linear or quasi-linear relationships. They treat the human operator as an element that processes information to control or make decisions in a closed loop (Pew and Baron, 1983). Through feedback, operators compare actual responses to those predicted or desired. Pew and Baron emphasise that this view of the operator is far more general than that of a simple error-correcting device. The manual control model is the most direct application of engineering methods. A driver steering an automobile is an example of the human operator being in the primary control loop. A mathematical expression represents the driver’s psychomotor skills, with human senses providing the input and output links to the system (Chubb, Laughery and Pritsker, 1987). Other engineering methods that are embraced are signal detection theory, optimal control and Bayesian decision making; for further discussion see Chapter 4. Their form may be either classical invariant or optimal. The models may be compensatory, pursuit, preview or precognitive (Evans and Karwowski, 1986). There is no intent to accurately describe the real internal processes occurring in humans when acting as control elements. Control-theoretic models focus on the engineering problem, without concern for a complete understanding of the human psyche and functional performance. Their critical property is that they prescribe the normative behaviour expected of humans (Pew and Baron, 1983). Mathematical relationships, found by observing only some human operators, become the expected, normal criterion of performance for all controllers.

Modelling human performance for the situation in Figure 2 is more problematic. On observing the response of the system, humans may adjust the set point to obtain better dynamic performance (e.g., improve stability, modify the rate of change of an output). A control-theoretic model may well suit these circumstances. Often, though, control-theoretic models do not offer a satisfactory explanation. Other reasons may guide the human in adjusting the set point: personal or organisational goals, for example. Models that attempt to explain the human processes in making decisions may be more efficacious (Rasmussen, 1986). Some human process models attempt to represent the operator’s basic cognitive functions: short-term memory models, visual scanning and detection models, attention models and workload models. Others focus on decision-making behaviour: industrial inspection models (Drury and Prabhu, 1994), fault-diagnostic models and human operator simulation models are illustrative.

Figure 2. The set point for process 2 is influenced by the output of other processes and human input.
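To make the arrangement in Figure 2 concrete, a small numerical sketch is given below. It is illustrative only: the weighting gains, the proportional controller and the first-order response of process 2 are assumptions introduced for the example, not parameters taken from the thesis.

```python
# Illustrative sketch of the Figure 2 arrangement (all values assumed).
# The set point of process 2 is a weighted sum of the outputs of two other
# processes plus a human adjustment; a proportional controller then closes
# the primary loop around process 2.

k1, k3 = 0.6, 0.4            # assumed weighting gains on processes 1 and 3
Kp = 0.8                     # assumed proportional gain of controller 2
human_adjustment = 0.0       # the supervisor may trim this to alter behaviour

output1, output3 = 5.0, 2.0  # assumed (constant) outputs of the other processes
output2 = 0.0                # process 2 starts away from its set point

for step in range(10):
    # Set point for process 2: weighted sum of the other outputs plus human input.
    setpoint2 = k1 * output1 + k3 * output3 + human_adjustment

    # Primary loop: the controller acts on the error between set point and output.
    error = setpoint2 - output2
    control_action = Kp * error

    # Crude first-order response of process 2 to the control action.
    output2 += 0.5 * control_action

    print(f"step {step:2d}  setpoint = {setpoint2:.2f}  output2 = {output2:.2f}")
```

In the semi-automatic case of Figure 1(b), a human rather than controller 2 would supply the corrective action; in the supervisory case, the human stays outside this loop and instead adjusts the human input term, the gains or the set point itself.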


While control models may aptly represent fast-acting control tasks such as flying, they may be quite inappropriate for slow-changing industrial processes (Bainbridge, 1981). Where changes are slow, human operators may spend considerable time on activities other than direct control. Control decisions therefore make up only a minor part of their decision-making activities (Beishon, 1974). In such circumstances the human is acting as a supervisory controller who monitors and troubleshoots the industrial process in which the computer performs most direct-control tasks (Sanderson, 1988). The human supervisor, for example, may spend considerable time setting goals. Bainbridge even questions the applicability of control-theoretic models under circumstances where they seem pertinent. Control models often assume that the supervisory controller uses process knowledge to predict the effects of alternative actions. She argues that supervisory controllers of industrial processes ordinarily do not obtain desired system states by “parameter trimming”. Instead, they access mental images of permanent look-up tables of state-action information. From statements by supervisory controllers about their behaviour (collected through verbal protocols), Bainbridge (1981) puts forward an episodic account of their activities. She found that they used goals to guide their search of conditional propositions about potential process behaviour.

Sheridan constructs a general model of supervisory control that does not constrain modelling to a single perspective. It is an overarching model that accommodates multiple perspectives for viewing supervisory tasks. Using the strictest meaning he assigns to supervisory control, the function of humans is to supervise manufacturing processes that are self-regulating (Sheridan, 1987). Humans, outside the primary loop, set the initial conditions, make adjustments intermittently, and receive information about the process (Figure 3). This meaning accords with Moray’s (1986) crucial requirement for supervisory control: the presence of a mediating intelligence between operator and task that is capable of autonomous decisions.

Figure 3. Strictest sense of supervisory control (Sheridan, 1987).

A broader meaning applies to processes that are not self-regulating. Feedback, if at all needed, is only by way of the human operator. The computer acts as an interactive agent between the human and process (Figure 4). It transforms data into displays that suit human apprehension, and transforms operator commands into control actions. The computer has controlling and displaying functions. In carrying out the supervisor’s commands, the computer may do some part of a task entirely, leaving other parts to the human, or it may provide some control compensation to ease the whole task for the human. It may integrate and interpret incoming information or give advice to the supervisor as to what to do next. Humans undertake multiple tasks (Sheridan, 1976). In doing so they allocate their attention to various graphical or alphanumeric displays and intermittently load new programs onto the computer that directly controls a physical process.

Sheridan’s (1976) general model brings together different features of human and machine activities: sensory mechanisms for acquiring information about the world, an internal cognitive mechanism for evaluating alternative responses, and mechanisms for carrying out action. The supervisory controller operates in four modes:

1. Planning
2. Teaching
3. Monitoring
4. Intervening.

Figure 4. Broader sense of supervisory control (Sheridan, 1987).

As planning is anticipatory, it is self-paced and not responsive to immediate control requirements. What Sheridan designates as teaching is the implementation of a plan to reach a goal, using the computer. Supervisory controllers monitor computer displays to make sure the automated production processes are working properly. If not, they diagnose the problems. Under emergency conditions, or for routine maintenance or repair, human operators intervene, thereby interacting more directly with the processes than in the normal self-regulating state. By 1987 he had added learning to these basic roles and ordered them into time-sequential steps with nested loops, shown in Figure 5 (Sheridan, 1987).

Figure 5. Functional and temporal nesting of supervisory roles (Sheridan, 1987).

Sanderson’s (1988) rewording, as follows, makes the meaning Sheridan ascribes to these operations more lucid:

1. Goal setting and planning;
2. Implementing a plan to reach a goal;
3. Monitoring the process;
4. Intervening when needed;
5. Learning from the results of previous actions.

The basic cognitive operations in supervisory control, listed above, are sufficiently broad to cover both steady-state and exceptional conditions. In trying to keep a system steady, supervisory controllers have to monitor its state and intervene judiciously to compensate for the inadequacies of the control system. Besides maintaining dynamic behaviour, they also have to deal with system failure. When some part of a system fails, supervisory controllers act to recover from the failure. Their action may include collecting information, testing hypotheses and making repair attempts (Bereiter and Miller, 1988; Morris and Rouse, 1985; Aström, 1985; Fox and Smith, 1984; Jones and Maxwell, 1986). They may also have to instigate suitable actions to keep the system operating while attempting recovery (Rouse, 1987). Again, these activities conform to the monitoring and intervention classification. The broad notions of monitoring and intervention are metalevel descriptions. They provide a framework for explaining behaviour. Models that offer more detailed explanations are then needed for the elemental activities. These functional models can be in a variety of forms, for example, control-theoretic or episodic.
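The nesting of these five roles can be shown with a small, self-contained sketch. It is a toy illustration only: the drifting “process”, the intervention threshold and the learning rule are assumptions made purely to show how planning and teaching frame each batch of work, how monitoring and intervening repeat within it, and how learning closes the outer loop; none of it is drawn from Sheridan’s formal model.

```python
import random

# Toy sketch of Sheridan's five nested supervisory roles (every quantity and
# rule here is assumed, chosen only to show the nesting of the loops).
# Outer loop per batch: plan, teach, then learn from the outcome.
# Inner loop within a batch: monitor the automated process, intervene if needed.

random.seed(1)

def supervise(n_batches=3, cycles_per_batch=12):
    tolerance = 2.0                                 # learned intervention threshold
    for batch in range(n_batches):
        target = 10.0                               # 1. goal setting and planning
        state = target                              # 2. teaching: automation starts on plan
        interventions = 0
        for _ in range(cycles_per_batch):
            state += random.uniform(-1.5, 1.5)      # 3. monitoring a drifting process
            if abs(state - target) > tolerance:     # 4. intervening when drift is excessive
                state = target
                interventions += 1
        # 5. learning: an arbitrary rule that tunes the threshold for the next batch.
        tolerance = tolerance * 1.2 if interventions > 3 else max(1.0, tolerance * 0.9)
        print(f"batch {batch}: {interventions} interventions, "
              f"tolerance for next batch {tolerance:.2f}")

supervise()
```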

The responsibility of supervisory controllers goes beyond the immediacy of controlling active systems. Goal setting, planning and implementation of plans are operations directed towards future production. For them, time response is not critical; accordingly, control-theoretic models do not appropriately represent the behaviour of supervisory controllers. These metalevel roles of the supervisor are accompanied by two other metacharacteristics: the loci of physiological function and levels of behaviour (Sheridan, 1987). The loci of physiological function are sensory (accessing displays, observing, perceiving), cognitive (evaluating the situation, accessing memory, making decisions), and response functions; Newell and Simon (1972) approach problem-solving from this perspective, as does Card’s (1984) Model Human Processor.

This cursory account of Sheridan’s general model of supervision shows its utility in representing supervisory control. Is the supervisory control paradigm applicable to discrete manufacture? For this question to be resolved, production processes have to be seen as having two distinct phases.

2.1.1 Two Distinct Forms of Control For both continuous and discrete manufacture, the output is a batch of product. There are one or more identical items in each batch. The notion of a batch may be unclear for continuous production. For our purposes, it will be the amount of product in one continuous run. The change of product or start up after shutdown begins a new batch. For its production, a batch makes use of one or more machines. For easy of discussion, the term “machine” will be broadly used and will encompass work stations at which humans provide all the productive input (e.g., manual assembly). In both discrete manufacture and continuous processing two separate phases are recognisable at a machine: the processing of a batch and changeover of batches. The function of supervisory controllers is different for each phase. The processing of an order in a flow-shop can illustrate the primacy of the phases. The order consists of a batch of distinctly separate items. This “order” batch is split into smaller batches, called transfer batches. Each transfer batch moves through a planned sequence of machines. At each machine all items in a transfer batch are completed before the batch moves to the next machine. Splitting the order into transfer batches allows the order to overlap machines. Assume that there is no delay between a transfer batch finishing on one machine and its loading on the next. The machines are consequently tightly coupled. Now consider something amiss at a machine. Performance degrades. For the transfer batch to finish by the expected time, the average cycle time for subsequent items in the batch has to reduce. Obviously, decreasing the size of the transfer batch statistically decreases the opportunity to reclaim lost time. For the extreme case of


a batch size of one, there is no margin for catching up on lost time outside the current cycle.
What happens if a perturbation in performance at a machine cannot be rectified by the batch's planned completion time? If the transfer batch overruns its planned time at the machine, it will be late starting at the next machine. Also, there will be a delay in freeing the current machine for the next batch. If time is not allowed to overrun, then the effect of the perturbation (e.g., a decrease in quality) propagates to the next, and possibly subsequent, machines. In the terminology of socio-technical systems, the variances are exported (Mumford, 1985). Variance here means a parameter's statistical variation falling outside acceptable bounds. Thus, decreasing the size of the transfer batch increases the likelihood of variances being exported. Exported variances require increased levels of supervision. With increasing levels there is greater organisational complexity. Hence, the aim in socio-technical design is to control variances as close to their point of origin as possible.
In continuous production, when the work in progress at a machine has finished, it transfers immediately to the next machine in its production route; that is, the transfer batch is an infinitesimally small element of a continuous variable such as volume, mass or power. There is little time to mitigate the effects of any perturbation. As variances are exported from their source, coordination and control have to extend beyond the source of the variances.
Discrete manufacture tends to be complex, with many activities occurring in a single operation. Setting up each activity is more difficult than adjusting the flow-control valves typically found in chemical processes. For example, devices may have to be attached or removed. Components may have to be correctly located and clamped. It is common for operators to set up machines by hand. They have to carry out all tasks that have not been automated. Operators normally monitor operating conditions using their senses, often unaided, as suitable devices for sensing operating conditions have been lacking. Therefore, monitoring tends to be local. When the equipment is not operating correctly, they have to intervene manually. Operators generally aim to rectify problems in a batch before it transfers to the next machine. Otherwise, the exported variances would make the already arduous task of monitoring and controlling even more difficult. The practice of having buffers of work-in-progress (WIP) between machines ensures a


sufficiently loose coupling of machines to allow most problems to be rectified at their source. In processing a batch at a machine, global monitoring and intervention become necessary when variances are exported. Because global monitoring and intervention are tractable in continuous processing (e.g., remote monitoring and control of flowrate), tight coupling between processes is common. The heavy reliance in discrete manufacture on direct monitoring and intervention by humans makes it necessary for control to be local. Therefore, it has been impractical to have tight coupling between machines. With the take-up of FMS and other advanced manufacturing techniques, buffers of WIP are becoming severely reduced, and consequently monitoring and intervention are having to extend beyond local confines.

Let us now consider the changeover of batches. When a machine finishes working on a batch, the batch moves on and the machine becomes free. The following decisions are required:
• What work, if any, will the machine undertake next?
• At what time shall this work be placed on the machine?
• What activities are required during changes?
The change in use of a machine may require the removal of tools and other material (e.g., washing pipework), the setting up of new tools and materials, and changing the machine's configuration. All this has to be planned.

In changing over batches, two basic operations in the general model of supervisory control are important: goal setting and planning, and implementing a plan to reach a goal. Planning is at two levels: the short and the long term. Long-term planning encompasses strategic planning, market and sales planning, planning forecasted production, and material requirement planning (Jain, 1987). The function of planning is to design programmes of actions that on execution achieve desired goals (Le Pape, 1992). There may be more than one plan that achieves a given goal. The planning problem then consists in selecting a "good" plan. Supervisory controllers are interested in the consequences of planning


forecasted production, and material requirement planning. The extent of their involvement depends on the organisation's hierarchical form. When the focus shifts from long-term plans to production over the immediate future (a day or week is typical), plans need to be detailed. Supervisory controllers need to deliberate upon the required operations, the machines that can carry out these operations, and the constraints on the ordering of operations (Hax, 1987; Tsubone, 1988). They decide the processing routes, the machine allocations, the materials required, and the order of processing at each machine. To make these decisions they may need to consider available capacity, the processing time for each operation, and the time needed to configure the machine for the operation. Note that available capacity depends on the availability of overtime and the potential for changing employee work assignments (Jain, 1987).

When a machine is in the changeover phase (e.g., on completing the processing of a batch), decisions on what to do with the machine have to be enacted. If all is well, these decisions have already been made as part of planning. If the system has not performed as planned, for example, when completion time happens to be significantly later than anticipated, supervisory controllers may have to replan the work. They may have to make decisions within restrictive time constraints.

While for continuous production the unit size of the transfer batch (i.e., its volume or mass) is, in the limit, infinitesimally small, the order batch is normally large. In cases where there is only a single invariant product, the "order batch" can be taken as the amount of product processed between plant shutdowns. Under these circumstances, the ratio of time spent preparing a machine for changing batches to time spent processing a batch is low. For discrete manufacture, in the extreme situation there are many highly varied products. Consider the case of the transfer batch being a single item with different products following in sequence at a machine. Then, the ratio of time spent preparing a machine for changing batches to time spent processing a batch becomes high. The more frequent the changes in batches at a machine, the greater the frequency of activities to set up the machine, and hence the greater the likelihood of mishap. Under discrete manufacturing conditions in which there are frequent changeovers, supervisory controllers are more likely to have to carry out rescheduling activities


than would be the case of continuous processing. In the most extreme environment, the job shop, it is common for new unplanned jobs to be released for production with little lead-time. Where this occurs there are greater demands placed on persons responsible for short-term planning.
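The two quantitative points made in this section, that a smaller transfer batch leaves less room to recover lost time and that it raises the proportion of time spent on changeovers, can be illustrated with a small numerical sketch. The figures below (a 6-minute item cycle, a 12-minute perturbation after the third item, a 30-minute set-up) are hypothetical and chosen only to make the relationships concrete; they are not drawn from the field study.

```python
def required_cycle_time(batch_size, planned_cycle, delay, items_done):
    """Average cycle time needed on the remaining items of a transfer batch
    if a delay is to be absorbed before the batch leaves the machine."""
    remaining = batch_size - items_done
    if remaining <= 0:
        return None  # no items left in which to recover the delay
    return (remaining * planned_cycle - delay) / remaining

def changeover_ratio(setup_time, batch_size, planned_cycle):
    """Ratio of set-up time to processing time for one transfer batch."""
    return setup_time / (batch_size * planned_cycle)

# Hypothetical figures: 6 min per item, 12 min lost after the third item,
# 30 min set-up between batches.
for n in (20, 10, 5, 2, 1):
    catch_up = required_cycle_time(n, 6.0, 12.0, items_done=min(3, n))
    print(n, catch_up, round(changeover_ratio(30.0, n, 6.0), 2))
```

As the transfer batch shrinks, the required catch-up cycle time falls towards zero and then becomes unattainable (None), while the changeover ratio climbs; this is the sense in which exported variances and frequent set-ups come to dominate as batches become smaller.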

2.2 Comparison of Continuous and Discrete Manufacture

Controlling discrete manufacture is not the same as controlling continuous processes. The discrete control task is different from continuous process control, though it does share certain important characteristics (Bereiter and Miller, 1989; Hwang, Sharit, and Salvendy, 1983; Smith and Crabtree, 1975). For a comprehensive review see Sanderson (1989). To assess whether knowledge of human supervisory control is transferable from continuous to discrete systems, the characteristic differences between the two domains need to be considered (Sharit, 1984).

Barfield, Hwang, and Chang (1986) compare supervisory control of discrete and continuous processes. They highlight the following differences. In continuous process manufacturing, the goal is to achieve steady state. A discrete manufacturing system, by contrast, produces parts as entities, and its supervisory control is difficult. For example, parts may travel through the system by different routes. Even identical parts may require different routes due to dynamic scheduling problems. This distinction between the maintenance of steady-state conditions and the routing of parts through the system is, as discussed in the last section, a comparison of activities from the two phases of production: the processing of batches and the changeover of batches, respectively. Monitoring and intervention operations dominate the maintenance of the ongoing production of a batch. Activities at the changeover of batches derive from a chosen plan to reach a goal. This plan may be one of a number of plans that meet the desired goals.

Barfield, Hwang, and Chang state that the complexity of the routing makes it hard to visualise a steady-state equilibrium. What they are referring to is the equilibrium of the system, not individual machines. However, each physical activity at a machine also has transient and steady-state stages. The operating conditions for all elements of the system in continuous processing are at steady state


for most of the time. In contrast, operations in discrete manufacture commonly have more than a single activity. Hence, the proportion of the time they spend in transitory behaviour is significant. It is easier to monitor perturbations in steady-state behaviour of a single activity than transient and steady-state responses for multiple activities. Thus, it is not difficult for a supervisory controller to monitor more than one process in steady state, and hence more than one machine. Supervisory controllers find it more difficult to spread their attention whenever many incidents of transient activity occur. Sharit (1985) questions the capability of humans to actively control discrete manufacturing systems. He compares direct and clear control of continuous processes to the indirect control in a complex discrete manufacturing environment. Supervisory controllers of continuous processes directly control system parameters and inputs. They can quickly observe the consequences of their actions on operating conditions and system outputs. While there may be many parameters (state variables) to control in continuous systems, the measures of performance tend to be clear. In contrast, many more factors normally influence the performance of discrete manufacturing systems. These factors may combine in many ways. Where sequential transfer batches at a machine are for different jobs, activities to reset the machine may be necessary. The environment becomes even more complex wherever there are large numbers of transfer batches, as is typical of discrete manufacture. The many combinations of events make for a lack of clarity in the relationships between system inputs and outputs. Highly complex systems are spawned from the coupling of indirect human control and combinatorial complexity. The factors Sharit uses in his comparison fit the two-phase classification. The maintenance of steady-state operation pertains to the processing phase of a batch. In discussing discrete manufacture he emphasises combinations of events and the routing of work. These factors have influence at the changeover of batches. The most commonly applied measures of performance, processing and completion times, concern changeover activities. Factors from the processing phase may also be used as performance measures, for example, variation in product quality.


The ability of humans to observe critical events decreases whenever their activities are restricted to monitoring for prolonged periods of time; their vigilance decreases (Matlin, 1988). Research on vigilance began with a classic article by Mackworth (1948), reporting how airforce operators showed less accurate radar monitoring after about half an hour of continuous watch. A human's efficiency in responding to critical system events is less likely to reduce if they are kept alert by actively intervening in system control. To maintain their alertness, Sharit (1984) argues, humans have to find work challenging. A significant part of their attention and abilities has to be in demand. He asserts that the work has to be coherent. Using Sheridan's general model of supervisory control, he characterises coherent work in the control of continuous processes as linking the major activities of monitoring, diagnosis, intervening, programming, and planning (Sharit, 1988).

The take-up of FMSs by industry has precipitated vigorous interest in the role of supervisory control in discrete manufacture. Sharit (1984) sees the work of FMS operators as lacking coherence, as their tasks are restricted to local monitoring and intervention. Sharit (1984, 1985) alleges that most manufacturers have developed FMSs in ways that implicitly downplay human decision-making functions. Operators' actions mostly result by default, when something goes wrong. When systems are operating correctly, there is little need for supervisory controllers to intervene. Their activities in the main reduce to monitoring. These conditions favour a decrement in vigilant performance and lapses in attention (Wickens, 1987); however, there is very little empirical evidence for the importance of classical vigilance effects in real industrial tasks.

Both FMS operators and continuous process operators intervene in response to undesirable events such as equipment failure. The response of process control operators may go beyond rectifying the problem at the particular location of the failure. They may re-route work or reassign jobs requiring the particular machine to other equipment or to a later time. FMS operators usually do not carry out such planning and scheduling activity, as the production supervisor is deemed to have that responsibility. Using the two-phase classification, the responsibility of FMS operators generally is limited to the processing phase, whereas the responsibility of continuous process operators extends to the changeover phase.


In discrete manufacture the supervisory control operations (monitoring, diagnosis, intervening, planning, and implementing a plan to reach a goal) are normally split between machine operators and production supervisors. Production supervisors plan future production mindful of production goals. These plans show the order of work loaded on each machine, and the times when each batch loads and unloads. In drawing up plans the production supervisor may have to consider, inter alia, possible configurations of machines, choice of routings, transfer batch size, operation precedence, availability of materials and the need to expedite some jobs. This requires monitoring the plant to find out whether problems have arisen that may affect the availability of machines or the progress of work. Such monitoring allows production supervisors to evaluate the status of the production environment. If actual production has deviated from that planned, they may amend the plan.

Machine operators, at the changeover of batches, undertake the activities needed to accomplish the plans drawn up by the production supervisor. Determination of the activities needed for changing batches may be left to the machine operators, or they may be partially or completely detailed in advance by the production supervisor or other planning personnel. Other personnel may be involved where materials have to be ordered, operating procedures have to be worked out and machines have to be programmed. Machine operators carry out these changeover activities and maintain ongoing processing. Like those of FMS operators, the activities of ordinary machine operators are limited to monitoring, diagnosis and intervention, which they carry out at the local level. They are less likely to suffer from a decrement in vigilance if they undertake productive activities, for example, manually loading machines.

In contrast, supervisory controllers of continuous processes monitor the progress of work through the whole system, intervening whenever necessary to maintain operating conditions. They decide the order of work to process through the system. Wherever alternative processing paths are possible, they decide the routing of the work. As their work covers both the processing and changeover phases, it is likely to be quite coherent.


2.3 Hybrid Intelligence

2.3.1 The Association with Supervisory Control

A supervisory controller that has human and machine intelligence combining cooperatively exhibits hybrid intelligence (Barfield, Hwang, and Chang, 1986). Sheridan's schema shown in Figure 3 is useful for drawing out the features of a hybrid system. He augmented this schema with loops that show supervisory control activities (Figure 6). Loops 1, 2, 6, and 7 relate to the human directly observing, indirectly observing, directly manipulating and indirectly controlling the process. Indirect observation and control of each physical task are via the human-interactive system's (HIS) computer communicating with the task's computer controller.

Figure 6. Multi-loop model of supervisory control (Sheridan, 1987).

Of particular interest to hybrid intelligence are loops 8, 9 and 10. Loop 8 shows the human operator getting feedback from within the human-interactive system. Feedback occurs in editing a program, running a planning model, and so on. Loop 9 refers to two functions: humans orient themselves to the means for indirectly controlling the process, and they adjust control parameters. In loop 10 they either orient themselves to the means for indirectly displaying information on the process or they adjust the display parameters. The features of hybrid intelligence are discussed in Chapters 4 and 7; nevertheless, at this stage of the discussion we would expect them to be related to the activities associated with loops 8, 9 and 10.

To share decision making with the computer, Sheridan's human operator has to be able to interact with it. Symmetry exists in their partnership (Figure 7). Replacing "displays" with the more general term "presentation", the human mirrors the features in Sheridan's representation of the computer. The computer presents information to the human by displaying it on a screen. Humans present information to the computer, using their hands to
manipulate the computer's keyboard and mouse. Through the computer's controls interface, the human adjusts parameters that control the computer's decision-making activities. Similarly, the computer may suggest that humans adjust parameters they use in their decision making.

Loop 8 in Sheridan's schema pertains to the editing and running of computer programs and consequently does not directly control any physical task. The activity fits Sheridan's teaching mode. In this mode Sheridan includes estimating what the computer knows of the situation and deciding how to instruct the computer to instruct the task-level computers to execute actions. It is the loop that is vested with the computer's decision-making activities. As human supervisors mirror the interactive computer's representation, one would expect a similar loop in their representation. Activities associated with these loops interact in some way that fuses expert systems, traditional algorithms and human decision-making (Tabe and Salvendy, 1988). Locating loops that cater for this interaction has to be posited on a more detailed understanding of the decision-making processes than we are exploring here. As this step is broad and treacherous, a double-headed arc in Figure 7 indicates interactive decision making, without specifying the process.

Combining computer intelligence with human intelligence is particularly pertinent to discrete manufacture. No control systems or expert systems can handle all abnormalities or changes in production requirements (Ammons, Govindaraj, and Mitchell, 1986; Dunkler, Mitchell, Govindaraj, and Ammons, 1988; Ranky, 1986). Contingencies that cannot be foreseen are unable to be encompassed in the design of computer programs. By integrating the resources of analytic models, AI (artificial intelligence) and human decision-making, the capabilities of each can be used. Importantly, it also allows each to compensate for the other's limitations (Nakamura, Shin, and Salvendy, 1991).


2.3.2 Hybrid Intelligence in Planning

Sheridan and other researchers (Bereiter and Miller, 1988; Rouse, 1987; Morris and Rouse, 1985; Aström, 1985; Fox and Smith, 1984; Jones and Maxwell, 1986) have intensively studied the interaction between operators and the machines they attend (see p. 11). Controlling the dynamic operation of physical systems interests them. In manufacturing systems this equates to maintaining the dynamic performance during the processing of a batch.

Figure 7. Human interactive subsystem (HIS): the human supervisor (presentation, controls) mirrors the HIS computer (presentation, controls).

During the processing phase, if the processes are highly automated, human activity reduces to overseeing (Ammons, Govindaraj, and Mitchell, 1988). Monitoring predominates. When necessary, human supervisors intervene, to tune parameters or compensate for deficiencies in the automated control. In studying human supervision, researchers' interests lie in sensory, cognitive and response performance. In discrete manufacture, studies on supervisory control have concentrated on flexible manufacturing systems (Nakamura and Salvendy, 1987; Mitchell, 1987; Mitchell and Miller, 1986; Sharit, 1984). There has been much less research into combined human-computer decision-making in planning the activities associated with the changing of batches at manufacturing resources.

Decisions to maintain the processing of a batch and those associated with the changeover of batches differ in type. Therefore, planning activities need to be disentangled from control activities. The scheduling of many transfer batches, in an environment in which changeovers occur frequently, is extremely difficult. In Chapter 3 the intractable nature of scheduling, and the consequential dearth of computer solutions, are discussed. Production supervisors cannot avoid the intractable. They must schedule production. Compelled to come up with techniques that produce workable schedules, they rely on their knowledge, gained through years of experience in the work environment. This tacitly acquired knowledge is a matter of habit,


experience and practice rather than consciously elaborated principle or procedure.1 While control during the processing of a batch often resides at the resource level in discrete manufacture, the planning of changeovers is at the aggregate level: the cell, the shop or the plant.

Supervisory control in planning may take various forms; Nakamura and Salvendy (1994) see three ways to share decisions between humans and computers (Figure 8). Hybrid intelligent decision-making is one of these.

Figure 8. Human-Computer Interactive Modes (Nakamura and Salvendy, 1994): (A) Algorithm and Knowledge-based System; (B) Manual System; (C) Hybrid Intelligent System.

What distinguishes hybrid-intelligent planning from the other forms? The "manual system" has the human as the planner. The computer provides suitable displays of information. From the viewpoint of Sheridan's multi-loop model (Figure 6), the computer in the human-interactive subsystem displays underlying data in forms that suit human apprehension. In the "algorithm and knowledge-based system", humans monitor computer-generated solutions. They only intervene to filter out poor solutions. In the "hybrid intelligent system", humans and computers both offer solutions.

1 Polanyi (1962) describes the acquisition of tacit knowledge as the process of gaining skills that cannot be specified in detail and cannot be transmitted by prescription, since no prescription for it exists. It passes on only by example from master to apprentice. By watching the master and emulating his efforts the apprentice unconsciously picks up the rules of the art.


Decisions come through a cooperative process. Nakamura and Salvendy's use of "hybrid" is in line with Barfield, Hwang, and Chang's (1986) definition. The use of "hybrid" as the term for a problem-solving process that combines human and computer intelligence is not universal. Other researchers use joint (Woods, 1986), cooperative (Eggleston, 1987), or even supervisory (Mitchell, 1990; Sheridan, 1987) for this process.

The cooperative process of decision making in Nakamura and Salvendy's (1994) hybrid intelligent model combines the features of the human, the OR (operations research) model and the AI model. In response to unanticipated events or changing manufacturing environments, it provides the "best" solution by combining the capabilities of the human and computer. The capabilities of each component are as follows:
1. Human: Experience; pattern recognition; inferential decision-making; etc.
2. OR model: Optimal solution based on a particular measure and rigid preconditions.
3. AI model: "Good" or "better" solution, based on rules extracted from the human expert. There is no guarantee that all the rules represent the "best" solution behaviour.

Some knowledge gained from supervisory control of the processing phase may apply to decision-making in the changeover phase. For example, rescheduling is akin to compensation in the domain of planning. It may be invoked under abnormal or emergency circumstances (e.g., sudden loss of a machine due to breakdown), or when a substantial deviation in the state variables requires intervention. While compensation is generally applied in continuous production to draw a process back to a desirable operational state from which it has deviated, rescheduling is applied in discrete manufacture to produce a new schedule. Changes are made to one or more of the following: the sequence of jobs, the sequence of operations for a job, the machines allocated to the operations, or the job completion times. The system, therefore, does not revert to the formerly desired state. As is the case for compensating for component failure, rescheduling to accommodate the loss of a machine may require human operators to plan for operation of the system in a degraded mode.
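To make mode (C) a little more concrete, the following sketch shows one possible shape of the cooperative pattern: the computer offers a sequence from a simple rule, the human may amend or replace it, and the computer evaluates whichever sequence is adopted. This is a hypothetical illustration of the interaction described by Nakamura and Salvendy, not their system; the function names, the earliest-due-date rule and the tardiness measure are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Job:
    name: str
    processing_time: float
    due_date: float

def computer_proposal(jobs: List[Job]) -> List[Job]:
    # The machine side offers a solution from a simple rule
    # (earliest due date first, used here purely for illustration).
    return sorted(jobs, key=lambda j: j.due_date)

def evaluate(sequence: List[Job]) -> float:
    # The machine side displays the consequences of a sequence:
    # here, total tardiness on a single machine.
    t, tardiness = 0.0, 0.0
    for job in sequence:
        t += job.processing_time
        tardiness += max(0.0, t - job.due_date)
    return tardiness

def hybrid_session(jobs: List[Job],
                   human_amendment: Optional[Callable[[List[Job]], List[Job]]] = None):
    proposal = computer_proposal(jobs)
    # The human side may modify the proposal (e.g. to honour a constraint
    # the rule knows nothing about); otherwise the proposal stands.
    chosen = human_amendment(proposal) if human_amendment else proposal
    return chosen, evaluate(chosen)
```

The division of labour mirrors Figure 8(C): both parties can offer a solution, the computer evaluates and displays the consequences, and the human modifies.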


2.4 Hybrid Intelligent Scheduling

At a machine, batches load and unload according to a planned schedule. Scheduling of real production systems is problematic, with automated techniques proving elusive. Algorithmic and knowledge-based scheduling face too many constraints to stand alone without human intervention; the problems of scheduling are discussed extensively in Chapter 3. The drawbacks of control algorithms and expert systems have led researchers to postulate the need for an active, 'engaged' human operator (Sanderson, 1988; Ammons, Govindaraj, and Mitchell, 1986, 1988; Johnson and Wilson, 1988; Sharit, Eberts and Salvendy, 1988; Tabe and Salvendy, 1988). Active engagement means intimate involvement in decision making. Although the human-computer interactive approach is more powerful than either one alone, the quality of the solutions depends upon the human's abilities (Nakamura, Shin, and Salvendy, 1991). Hence, the computer needs to present information in a way that favours human endowments, thereby improving problem-solving performance.

Humans bring useful information-processing skills to supervisory control: in particular, inductive-logic and pattern-recognition capabilities. In discrete-parts manufacturing systems, unpredictable events such as machine breakdowns and changes in priorities can occur frequently. These create system states that might be difficult for the computer to control, given its predetermined logic structure, but perhaps quite simple for the human to manage (Sharit, 1984). In a hybrid intelligent production scheduling system (HIPSS) the computer helps humans in their decision making. An important requirement of "hybrid" automation systems is that the automated processes be made accessible to the human operators in a way that is consistent with human cognitive processing. The human may operate with quite different constructs from the automated algorithms, but be required to monitor and/or take over from them under conditions of uncertainty and rapid change.

While activities and events in a manufacturing system may occur in a deterministic way, their complexity makes it impractical to explain and predict performance by functional analysis. To be able to dynamically maintain an optimal schedule, methods for estimating and identifying the manufacturing

system's state are required. These methods are based on normative behaviour. This constitutes a problem due to the inability to adequately respond to unforeseen events.

Table 1. Differences in Human Supervisory Control between Computerised Discrete Part and Process Manufacturing (Barfield, Hwang and Chang, 1986).

Direct vs. indirect control
  Discrete-parts manufacturing: Almost exclusive reliance on indirect control.
  Process industries: Control is more direct, and there is a greater reliance on manual control.

Qualitative assessment vs. quantitative assessment
  Discrete-parts manufacturing: Assessment of the system is more qualitative; the emphasis during perception is on pattern recognition.
  Process industries: Assessment of system status is often based on magnitudes and rates of change; the emphasis during perception is on detection.

Predictability
  Discrete-parts manufacturing: Predicting the effects of control actions and anticipating time lags between these actions and system response are difficult.
  Process industries: The more direct relationship between control inputs and system outputs allows for more accurate and reliable prediction.

Effect of plant size
  Discrete-parts manufacturing: Larger and/or more complex systems can totally alter the control strategies utilised.
  Process industries: The effect of larger and/or more complex systems on control strategies is usually more limited in scope.

Effect of output
  Discrete-parts manufacturing: Severe disturbances to the system do not necessarily jeopardise the overall functioning of the system.
  Process industries: Severe disturbances often have serious effects on the entire process.

Control strategy
  Discrete-parts manufacturing: In formulating ongoing control strategies, system output is used more heuristically.
  Process industries: System output is generally used to update the parameters associated with the relationship between inputs and outputs.

Table 1 shows the differences between human supervisory control of discrete manufacture and the process industry as drawn by Barfield, Hwang, and Chang (1986). From the planning and scheduling perspective, the important features of


this table are the form of assessment of performance and plant predictability. Quantitative measures for assessing the state of the production system and for predicting performance tend to be impracticable for discrete manufacture. Production supervisors instead rely on recognising familiar patterns and using inductive logic to assess the system's state and, consequently, to anticipate the effects on system performance.

2.5 Summary

The purpose of this chapter is to develop an understanding of what 'hybrid' automation is, to locate it in the context of discrete manufacture, and to show that for discrete manufacture production scheduling is central to it.

Hybrid human-computer intelligent automation was located within the human supervisory control paradigm of continuous processes. A general supervisory control model allows different methodologies to apply to different aspects. Sheridan uses a three-dimensional array of roles of the supervisor, loci of physiological function and levels of behaviour.

Activities associated with maintaining performance in the processing of a batch were shown to be distinctly different from the activities associated with the changeover of batches. In discrete manufacture these two phases of production are generally separated. The production supervisor is generally responsible for planning the changeover of batches and the machine operator controls the processing of a batch. As the changeovers of batches are much more dominant in discrete manufacture than in continuous manufacture, production scheduling is central to hybrid human-computer intelligent control of production. Therefore the author's contribution to research on hybrid human-computer intelligence focuses on hybrid human-computer intelligent scheduling.

In the next chapter the argument is presented for the application of hybrid human-computer intelligence to scheduling jobs in discrete manufacture. To effectively understand human-computer interactions, one must first understand


how the human operator solves scheduling problems (Nakamura and Salvendy, 1987). Therefore the subsequent chapters consider how humans and computers interact in decision-making (Chapter 4), the types of scheduling strategies that have to be accommodated (Chapter 5), and the form of an HIPSS (Chapter 7).


Chapter 3 The Potential for Hybrid Intelligent Production Scheduling

This chapter focuses on the features of scheduling that are pertinent to the investigation of ‘hybrid’ automation. The intractable nature of scheduling, the limitations and problems of algorithmic and knowledge-based scheduling, and the consequential dearth of computer solutions, are discussed. The case for active engagement of humans in decision making associated with scheduling is then presented.

3.1 Constraints Define Manufacture

In planning manufacture, a scheduler coordinates activities within the bounds set by constraints. If there is only one possible arrangement of production that meets all constraints, then a scheduling problem does not exist. It is much more common, however, for schedulers to find that either there are no feasible schedules or there is more than a single possibility from which to choose. Some constraints need to be relaxed where the manufacturing conditions cause infeasibility. A scheduler has to decide which to relax. Where there is more than one feasible plan the problem is under-constrained. A scheduler acting under these conditions has to decide which of the competing schedules will be applied.

The planning process for producing a schedule, described using constraints, is shown in Figure 9. The over-constrained case is shown at the top. The problem becomes redefined when constraints are relaxed. Its redefinition depends upon the degree of relaxation. If the loosening of constraints results in many feasible schedules, then the problem becomes that of finding the most suitable. If the constraints are only lightly relaxed, to allow one or only a few feasible schedules, the scheduler has to decide which to relax to produce an appropriate schedule.

Figure 9 Constraints define the scheduling process
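Read as a control loop, the branching in Figure 9 amounts to: look for feasible schedules; if there are none, loosen a constraint and try again; if there are many, select among them using goals and performance measures. The sketch below is only a schematic rendering of that loop; the three helper functions are assumed to be supplied elsewhere and are not algorithms proposed in this thesis.

```python
def schedule_by_relaxation(constraints, find_feasible, relax_one_constraint, select_best):
    """Schematic of the process sketched in Figure 9.

    find_feasible(constraints)        -> list of feasible schedules (may be empty)
    relax_one_constraint(constraints) -> a loosened copy of the constraint set
    select_best(schedules)            -> the schedule preferred under the goals
    """
    while True:
        feasible = find_feasible(constraints)
        if not feasible:
            # Over-constrained: decide which constraint to loosen and retry.
            constraints = relax_one_constraint(constraints)
            continue
        if len(feasible) == 1:
            return feasible[0]
        # Under-constrained: choose among competing schedules using
        # goals and performance measures.
        return select_best(feasible)
```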

This section discusses:
1. The production process using constraints;
2. The classic Operations Research approach as fitting the heavily loosened constraints path;
3. The constraint satisfaction methodology within Artificial Intelligence (AI) as following the lightly relaxed constraints path.

3.2 Production Process as Constraints

In manufacture, jobs consist of defining attributes. These attributes are informational: graphical and textual (and perhaps verbal) descriptions of the final product. An attribute's value can be construed as placing a constraint on the form of the product. For example, in the manufacture of pens the value of red for the attribute 'ink colour' constrains the colour to red and thereby defines the final form of the pen: a red pen rather than, say, a blue pen. The function of the manufacturing system is to transform informational constraints into the tangible product required by the customer. Besides the constraints defining physical form there are also constraints defining time. A customer requires the product at a specified time. Depending upon where the boundary of the manufacturing system is drawn, attributes may also include physical components: pre-processed work.

This work, likewise, is the result of the transformation of attributes into a product within the constraints imposed by physical form and time. By working within the constraints defined by the attributes, the manufacturing organisation produces an article that meets the customer's expectation. A key constraint is the customer's expectation on the realisation of the product: the product meets the expected due date. Other attributes place constraints on materials: their final form and shape. A manufacturing system consists of machines that also have attributes. The attributes define processing capabilities. The process of dividing jobs into operations is the act of matching job attributes to machine attributes (capabilities). The consequent operations are technologically feasible.

Figure 10 Entity-Relationship for multiple operations: a JOB (job#, pieces#, due_date, ...) CONSISTS_OF one or more OPERATIONs (operation_identifier#, job#, operation#, ...); each operation IS_PROCESSED_BY (operation_identifier#, machine#, processing time, ...) a MACHINE (machine#, processing speed, ...)

Operations and machines form an entity relationship (see Figure 10). Processing time is a relational attribute. A simple example is the photocopying of a manuscript. The number of copies to make is the number of pages, an attribute of the job. The time to produce a single copy is an attribute of the photocopier. Hence, the time to copy the manuscript completely depends on the relationship between manuscript and photocopier. In other words, the processing time is an attribute of the relation 'IS_PROCESSED_BY' between the entities, job and machine, as shown in Figure 10 (Jackson, 1988; Hughes, 1988). From the due date (a job attribute) and the processing time (a relational attribute) a constraint is placed on the latest starting time needed to meet the due date.

The entity relationship does not show dependency upon sequential time. If the previous operation performed on the machine requires a different set-up to the

current operation, then it must be changed. The need to set up may also define the boundary for an operation. Consider, for example, cutting sheets of paper with a guillotine. After cutting the length, the guillotine is adjusted and the width is then cut. If the adjustment time is small, cutting a sheet may be a single operation. However, if the adjustment time is large, cutting lengths would most likely be separated from cutting widths. The separate operations would then be performed on separate machines or on the same machine at different times. The division of a job into discrete activities depends upon physical and temporal constraints.

Figure 11 Relations represented by classes: classes Operation (operation#, due date, no. parts, methods, ...) and Machine (machine#, proc. speed, methods, ...) represent the operation and machine entities, and class OperationMachine (operation#, machine#, proc time, start time, finish time, methods, ...) depicts the relation IS_PROCESSED_BY

When there is more than a single operation, the due date for an operation is derived from the subsequent operation's starting time. Where there is no competition for the use of resources, the latest starting time for the job's next operation is given by the slack time remaining: the difference between the time to the job's due date and the remaining processing time (Svestka, 1988). A scheduler trying to meet a customer's due date in a competitive environment has to consider machine availability. To determine machine availability, the times that a machine is expected to be used to process other operations have to be identified. The period of utilisation is a property of the relation between operation and machine. It cannot be represented by entity-relationships, as they are atemporal. However, object-oriented models can incorporate time.

Three classes can be used to depict the IS_PROCESSED_BY relationship. Slots in the Operation and Machine classes are used for the entity attributes. A third class, OperationMachine, represents the 'IS_PROCESSED_BY' relation (see Figure 11). Apart from slots for the relational attributes, there are also slots that locate relations within time. When an operation is allocated to a machine, an instance of the OperationMachine class is instantiated, that is, created (Figure 12). Slots for starting time and finishing time define the period that the machine is used to process the particular operation.


Consider the case where the scheduler wants to place operation 3 on machine 1: the scheduler proposes to instantiate Operation3Machine1 (see Figure 13). All instances that represent 'IS_PROCESSED_BY Machine1' will place constraints on the proposed allocation. These instances constrain the values of the slots for starting and finishing time in Operation3Machine1. If the machine can only perform one operation at a time, then the period bounded by the starting and finishing times of an operation excludes other operations.

Figure 12 When a job is allocated to a machine, an instance of OperationMachine is created, showing which machine is to process it and at what time (classes: Operation, OperationMachine, Machine; instances: Operation1, Operation2, Operation3, Operation1Machine1, Operation2Machine2, Operation3Machine1, Machine1, Machine2)

Figure 13 Constraints from other OperationMachine instances on Operation3Machine1 (Operation1Machine1, Operation4Machine1 and Operation5Machine1 on Machine1 constrain the proposed allocation of Operation3)
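A minimal object-oriented rendering of the three classes and of the exclusion constraint discussed above might look as follows. The slot names follow Figure 11; the Python form, the function names and the error handling are assumptions made for illustration rather than the representation used later in the thesis.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Operation:
    operation_id: str
    due_date: float
    num_parts: int

@dataclass
class Machine:
    machine_id: str
    allocations: List["OperationMachine"] = field(default_factory=list)

@dataclass
class OperationMachine:
    """An instance of the IS_PROCESSED_BY relation, located in time."""
    operation: Operation
    machine: Machine
    start_time: float
    finish_time: float

def can_allocate(machine: Machine, start: float, finish: float) -> bool:
    # If the machine performs one operation at a time, the proposed period
    # must not overlap any existing OperationMachine instance on it.
    return all(finish <= om.start_time or start >= om.finish_time
               for om in machine.allocations)

def allocate(operation: Operation, machine: Machine, start: float, finish: float) -> OperationMachine:
    if not can_allocate(machine, start, finish):
        raise ValueError("period clashes with an existing allocation")
    om = OperationMachine(operation, machine, start, finish)
    machine.allocations.append(om)
    return om
```

Proposing Operation3Machine1, as in Figure 13, then reduces to a can_allocate check against the OperationMachine instances already recorded on Machine1.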

Another source of constraint is the setting up of a machine ready for processing. Assume that the starting time for Operation3Machine1 has been set such that Operation4Machine1 is its immediate predecessor. The attribute values of Operation4 will place set-up time constraints2 on Operation3Machine1 (see Figure 14).

2 If Operation3 and Operation4 have exactly the same attribute values, the set-up time constraint is zero.

Figure 14 The immediate predecessor to Operation3Machine1 establishes a constraining relation on Operation3 (Operation4Machine1 PRECEDES Operation3Machine1; Operation4 VARIES_FROM Operation3)
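The set-up constraint established by the immediate predecessor can be sketched as a sequence-dependent quantity: the set-up required before an operation depends on which of its attribute values differ from those of the operation that precedes it on the machine. The attribute names and times below are invented for the illustration; footnote 2's special case (identical attributes, zero set-up) falls out directly.

```python
# Hypothetical per-attribute changeover times (minutes).
SETUP_TIME_BY_ATTRIBUTE = {"tooling": 20.0, "material": 10.0, "fixture": 15.0}

def setup_time(previous_attributes: dict, next_attributes: dict) -> float:
    """Set-up incurred when the next operation follows the previous one on
    the same machine; identical attribute values incur no set-up."""
    return sum(t for attr, t in SETUP_TIME_BY_ATTRIBUTE.items()
               if previous_attributes.get(attr) != next_attributes.get(attr))

# Example: Operation4 immediately precedes Operation3 on Machine1.
op4 = {"tooling": "T7", "material": "steel", "fixture": "F2"}
op3 = {"tooling": "T7", "material": "brass", "fixture": "F2"}
print(setup_time(op4, op3))  # 10.0: only the material differs
```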

The order of operations may restrict a job's route through the manufacturing system: there is linear precedence (Conway, Maxwell and Miller, 1967). For example, in book production, the pages must be printed before they are bound. This adds another dimension to the constraints imposed by other operations. If Operation102 has to be processed before Operation103 because of technological imperatives, then the finishing time of Operation102MachineX has to precede the starting time of Operation103MachineY. In the simplest manufacturing system, all jobs have the same operations and strict linear precedence. There is only a single path or route through the system. This is the case of a simple flow shop (see Figure 15). At the other extremity is the classic job shop. Operations and linear precedence differ between jobs. Jobs do not share a common route through the system.

Figure 15 Comparison of the flow of jobs for flow and job shops
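Linear precedence can be stated compactly. Writing S and C for the start and completion times of an operation on its allocated machine, and m_j for the number of operations of job j, a sketch of the constraint (the notation is assumed here, not taken from the thesis) is:

```latex
C_{j,k} \le S_{j,k+1}, \qquad k = 1, \dots, m_j - 1
```

For the example in the text this is simply C_102 <= S_103. In a flow shop every job visits the machines in the same order, so these chains all follow a single route; in a job shop the machine order in the chain differs from job to job.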

Detailed planning of production reduces to organising the passage of the operations for each job through the manufacturing system. A scheduling problem does not exist if this can be done without violating any constraint. While some constraints may be inviolate, for example resource constraints, others may be modifiable or able to be relaxed (Smith and Fox, 1985). For instance, if the due date has to be violated, a new due date may be able to be negotiated with the customer. If the customer accepts a modified due date without penalty, the change removes the constraint violation. Assuming that this was the only violation, the proposed schedule becomes feasible. Alternatively, the scheduler may relax the due-date constraint and allow delivery to be tardy. The customer’s acceptance of


the relaxation is crucial. If the customer does not care when their job is delivered (i.e., any proposed variation is within an acceptable period), then the constraint in effect does not exist. However, the customer may be highly affronted by the variation, perhaps refusing to pay or being unwilling to place future orders. For this circumstance, violating the constraint is unacceptable and the constraint is classified as hard.

The assignment of machine time for an operation is affected by the availability of the machine. Availability of a machine is affected by:
• the assignment of machine time to other operations, as discussed above;
• the assignment of machine time to planned repairs or maintenance;
• unplanned down time;
• delays due to the times required to process previous operations overrunning their expected finish times.
Besides these factors, availability can also be conditional. For example:
• a machine may become unavailable for some jobs, while still being available for others, due to a deficiency in tools, raw materials or human resources;
• the capabilities of a machine may be increased through the temporary addition of ancillary parts, thereby increasing the available resources for a particular operation;
• the processing speed of a machine may be increased for some operations, or overall, thereby decreasing processing time and increasing available time.

3.3 Constraint Relaxation

The primary objective of scheduling is to develop a schedule that satisfices all the constraints3. In practice, numerous constraints are common. Some can be precisely defined: a machine is capable of performing a specific operation. Others

3 Satisficing is the practice of evaluating decision choices one at a time until one is found that is satisfactory (Simon, 1955; 1960). As not all choices are evaluated, the one chosen may not be the optimum.


may be subject to uncertainty: processing times are estimates; material arrival times depend upon external agencies. Constraints may be hard or soft, as the discussion on due dates illustrated. The large number and variety of constraints makes finding feasible schedules cumbersome. Therefore, in their search for mathematical methods, operations researchers have tended to restrict themselves to manageable theoretical problems: those that are subject to a limited number of constraints. As constraints are reduced, the number of feasible schedules increases: there is an inverse relationship between the number of constraints and the number of feasible schedules (see Figure 16).

Figure 16 Relationship between constraints and feasibility (as constraints increase from low to high, the number of feasible schedules falls; with few constraints the problem is to select the most suitable schedule, with many it is to find a feasible schedule)


3.4 Heavy Relaxation

Figure 17 The path of heavy relaxation

Where there is more than a single feasible schedule from which to choose, the scheduler has to decide which to select. The simplest modus operandi for schedulers is to place the operations on a Gantt chart in some order that meets all the constraints.4 While feasible, this may not be the 'best' choice. Where the problem is heavily relaxed or under-constrained from the start, the scheduler has to:
1. Formulate criteria for deciding which schedule to implement;
2. Find the schedule that best meets these criteria.

4 By meeting the constraints, schedulers who do not act systematically may still produce feasible schedules although they have placed jobs on a Gantt chart arbitrarily.


3.4.1 Performance Objectives

To be able to choose one schedule over others, a scheduler requires some criterion on which to judge the suitability of prospective schedules. The normative approach taken by operations researchers is to formulate the criteria into a mathematical function, known as the objective function. Finding the schedule that minimises or maximises5 this function then becomes the goal. As the value of the objective function serves as a measure of performance, it is also known as the performance measure.

Within operations research, manufacturers' objectives are commonly perceived to be the maximisation of customer service and the minimisation of cost. What is customer service? It is indeterminate and abstruse. While meeting the customer's expected due date has primacy, the customer's perception of service is also important. It includes such intangibles as prompt and courteous replies to any queries they put regarding their jobs. Costs are manifold and unclear. Costs arising from not meeting a job's due date are both tangible (costs for extra clerical work and overtime) and intangible (loss of goodwill and dwindled customer satisfaction) (Cheng and Gupta, 1989). Many companies do not have a comprehensive knowledge of significant costs. These may be inventory holding costs for WIP and finished goods, interest losses on deferred payments for tardy jobs, tardiness penalties, the cost of machine idleness, the cost of operator idleness and the marginal costs associated with sub-contracting the work (Yang and Sum, 1993; Browne, Boon, and Davies, 1981).

Operations researchers have therefore sought other operationally accessible measures to act as surrogates. Customer service is usually reduced to meeting due dates, as a customer is considered well served if the due date is met. If the due date is not met, there needs to be a means for indicating the extent of customer dissatisfaction. Comparison of planned due dates to delivery dates gives a quantitative measure, the lateness (Sadowski and Harmonosky, 1987). Service is then seen to be a maximum when lateness is a minimum.

5 Which a scheduler pursues depends upon the objective.


The measure of cost is confined to the dollar value of work-in-progress (WIP) and the earnings that idle machines forgo. Minimising cost reduces to the minimisation of WIP and the maximisation of machine utilisation (Emmons, 1987). To minimise idle machines, their average utilisation is maximised.

Which objective a scheduler pursues is contingent upon the manufacturing environment. For heavily loaded shops, machine utilisation is very important (Morton and Pentico, 1993). Turnaround is critical for shops that compete primarily on delivery lead times. Such shops generally have ample resources to provide a quick turnaround. Accordingly, they are not normally fully loaded. Flow-based objectives befit these circumstances. Due-date objectives apply where customers want reliable delivery times. For problems outside these distinctive classes, some general tendencies can be drawn. Machine utilisation and the meeting of due dates tend to be satisfied if delays are minimised. Delays are minimised by minimising objectives that depend only on processing times: Emmons (1987) listed nine, based on completion time, flowtime and waiting time. These also measure congestion in the system.

These surrogate objectives are still not operationally direct. For example, to maximise machine utilisation, finding the schedule that maximises average utilisation, U, over the scheduling horizon becomes the goal. Average utilisation is found by dividing the sum of the processing times for all the jobs being scheduled by the available time (Conway, Maxwell and Miller, 1967). The available time is the total machine time available for the period. Assume all the machines are available for the total time and all jobs become available simultaneously. For finite problems the available time is then the product of the number of machines and the makespan, the time until all the jobs are complete (Emmons, 1987). The intuitive idea is that finishing the given set of activities earlier will allow new activities to be started earlier (Morton and Pentico, 1993). All scheduled jobs are not finished until the last job has finished.

An important class of scheduling objectives is that of regular measures. Under some circumstances, simpler representations can be applied so that intractable problems can be made tractable. This will be elaborated upon in the next section. There are two conditions for a measure to be regular. It must be a


function of completion time and it must increase, or at least not decrease, if at least one completion time for the jobs being scheduled increases (Conway, Maxwell and Miller, 1967). Flowtime (Fj = Cj - rj, where rj is the time job j is ready for processing and Cj is the time job j is completed) is the length of time a job spends in the system. The sum of all flowtimes is a regular measure, as are the maximum and average flowtime values. Other regular measures are functions of lateness (Lj = Cj - dj, where dj is the due date for job j) and tardiness (Tj = max{0, Lj}). A schedule optimal with respect to maximum flowtime is also optimal for maximum completion time and maximum lateness (Conway, Maxwell and Miller, 1967).

Delivering jobs to customers when they expect them is the most obvious scheduling objective. Therefore, where due dates are known, schedulers prefer to seek the minimisation of average lateness or average tardiness as their primary objective. Where due dates are not specified, the minimisation of the average flowtime is a common objective, as it measures the average time jobs spend in the system. It also minimises the average WIP. Since manufacturers generally do not want jobs to finish well before their due date, average tardiness provides a better measure than average lateness (Blackstone, Phillips and Hogg, 1982). This is particularly so where late jobs have time-dependent penalties while there is no benefit in completing jobs early (Baker, 1974). While the average and maximum values of lateness and tardiness are regular measures, average and maximum earliness are not. An increase in completion time for a job may result in earliness changing from a positive value to zero. Therefore, analysis becomes more difficult.

Weights can be used in objective functions to allot relative importance to jobs. Weighted tardiness, Twt, is an example of jobs differing in relative importance:

T_{wt} = \sum_{j} w_{j} T_{j}

where the weights, wj, indicate monetary penalties incurred for late delivery (Baker, 1974; Morton and Pentico, 1993). The values of the weights are negotiated with the customer.
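For reference, the measures introduced in this section can be collected in one place. The notation simply consolidates the inline definitions above; the symbols p_j (processing time of job j), m (number of machines) and C_max (makespan) are introduced here for convenience.

```latex
F_j = C_j - r_j                              % flowtime
L_j = C_j - d_j                              % lateness
T_j = \max\{0, L_j\}                         % tardiness
T_{wt} = \sum_j w_j T_j                      % weighted tardiness
\bar{U} = \frac{\sum_j p_j}{m \, C_{\max}}   % average utilisation over the horizon
```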


A refinement on weighted tardiness is weighted earliness plus weighted tardiness. In Just-In-Time (JIT) environments, customers do not want jobs to be tardy but are also not willing to accept early delivery. Therefore, to discourage early delivery the weights for earliness are set much higher than those for tardiness (Morton and Pentico, 1993). Otherwise, tardiness is penalised more heavily. The earliness weight may reflect the cost of holding finished jobs in store.

Normally, using a single criterion in a composite measure obfuscates the effects on the manufacturing system. For example, consider a case in which a scheduler wants to maximise both machine utilisation and the percentage of non-tardy jobs. Assume that they are equally weighted. Suppose the composite value is 70%. Both factors may be equally 70%. Equally, 95% utilisation and 45% non-tardy jobs, or 45% utilisation and 95% non-tardy jobs, would also have a composite value of 70%. For the case of 95% average utilisation predicted by a schedule, which only has estimates for processing times, the manufacturing system may encounter bottlenecks and consequently end up with far more tardy jobs than predicted. In contrast, with 45% utilisation there is potential to add unanticipated jobs without due-date performance degrading. Therefore, for a single value of the composite, the manufacturing system could be in extremely different states. Consequently, for a scheduler, the values for utilisation and tardiness have greater import than the composite value. Under an alternative strategy, a primary goal that must be satisfied (e.g., minimise makespan) is first chosen. Over the subset of schedules that satisfy this requirement another criterion is then optimised (e.g., minimise average completion time) (Emmons, 1987). Even where there is only one factor in an objective function, a single value can represent systems with quite disparate states. For example, the same value of average tardiness may be attained when all jobs are considerably tardy and when some jobs are very tardy while others are marginally tardy.

In summary, broad objectives such as the maximisation of customer service and the minimisation of cost are often replaced by quantifiable terms acting as operational surrogates. These are used in objective functions. An optimum schedule is sought by a search for the minimum value of an objective function. While human schedulers in practice commonly pursue more than a single objective, the use of mathematical functions to represent composite objectives is problematic. To find a

schedule that optimises such a function is normally unrealisable. Even where the objective has only one component, its value may not clearly define the system’s state.
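The ambiguity described above, namely that one composite value can stand for very different shop states, can be seen in a few lines of arithmetic. The sketch below simply reproduces the hypothetical utilisation and non-tardy percentages discussed earlier; the equal weighting is an assumption of the example.

# Two very different shop states that yield the same equally-weighted composite value.
def composite(utilisation, pct_non_tardy, w1=0.5, w2=0.5):
    return w1 * utilisation + w2 * pct_non_tardy

print(composite(95, 45))   # 70.0: highly loaded shop, poor due-date performance
print(composite(45, 95))   # 70.0: lightly loaded shop, good due-date performance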

3.4.2 Finding the Most Appropriate Schedule

Once an objective function has been established, it has to be optimised. The problem now becomes that of seeking, from the set of feasible schedules, the schedule that optimises the objective function. Researchers sought solvable analytic models that would yield exact optimal solutions. However, except for some simple cases, these were elusive (Buzacott and Yao, 1986; Ben-Arieh, 1988; Newman, 1988).6 After some early successes in the 1950s and 1960s, such as Johnson's algorithm (Johnson, 1954) for sequencing n jobs on two (and, in restricted cases, three) machines, it was found that even the simplest idealised problems, although they may be formulated elegantly using integer or dynamic programming, require an inordinate amount of computation time to solve exactly. To find an optimum schedule, researchers were forced to search exhaustively through the complete set of feasible combinations. The development of the theory of computational complexity has greatly clarified the issue (Rinnooy Kan, 1976; Garey and Johnson, 1979). The vast majority of scheduling problems require some sort of combinatorial optimisation and so are generally NP-hard (Lawler et al., 1993). That is, for these problems the fastest currently available algorithms (exact solution methods) are exponential in time. In other words, the number of computations required to solve the model exactly grows exponentially with the problem size: that is, with the number of operations to be scheduled. To avoid intractable integer and dynamic programming, researchers began to develop approximate methods. They sought methods that would reduce the potential solution space and thus curtail enumeration. Branch-and-bound, an

6 The optimisation paradigm grew in the 1950s and 60s using linear programming (Solberg, 1989).


implicit form of enumeration, provided a means for eliminating permutations that did not approach optimality (Bauer et al., 1991). While significantly pruning the size of problems, branch-and-bound is still computationally demanding. Researchers became aware of the extent of computational complexity in the 1970s. Trying to break the nexus between problem size and exponential time, operations researchers sought intuitive algorithms that would produce near-optimal schedules within a reasonable time. These algorithms are known as heuristic methods (Chou, Jeng and Jeng, 1988; Solberg, 1989). Heuristics are simple algorithmic procedures that dramatically limit the search for solutions in large problem spaces (Barr and Feigenbaum, 1981). They may be viewed as information processors that deliberately but judiciously ignore certain information. The art of heuristic design lies in knowing exactly what information to ignore (Bartholdi and Platzman, 1988). The difficulty with applying heuristics to scheduling problems is that it is very difficult to decide which information to ignore. The loss of information takes place in two stages. Firstly, in order to build an operations research model, some aspects of the real problem are ignored. Set-up times, for example, are assumed predictable, even if they depend on the current configuration of the machine being reset. Goals are simplified to one or two elements (machine utilisation and average tardiness, perhaps), even if the scheduler's goals in practice incorporate many more elements. Secondly, heuristics that use only a very restricted set of information are applied to the simplified model. An example is a 'greedy' (or myopic) type of heuristic that only considers one step ahead. While it is easy to implement, it ignores everything that happens after the first step. Heuristics can generally be shown to be polynomial in time, but their effectiveness (near optimality) normally has to be demonstrated empirically (Foulds, 1984). A heuristic is considered useful if it offers good solutions most of the time.

There are two broad approaches to heuristic scheduling: priority rules and search. A priority rule sets the relative priority of each job using a simple algorithm. From comparative tests and experience, researchers and practitioners have found that particular priority rules tend to satisfy certain performance objectives. This is explored further below. Search techniques seek improved performance by modifying the priority order of jobs iteratively. They start with an initial schedule, which has been created, in general, by a simple priority rule. In neighbourhood search, the place each job has in the list is then changed in a systematic way. For example, adjacent jobs are swapped. Restricting the number of combinations drastically reduces the search space.

3.4.3 Heuristics

Search heuristics generate many solutions in the pursuit of an optimum schedule. Unlike branch-and-bound, their drastic pruning of the search space results in approximate solutions, which are not guaranteed to be optimal. The techniques foremost in combinatorial search are neighbourhood and beam search. Neighbourhood search techniques are simple, flexible methods for obtaining good solutions quickly (Baker, 1974). They start with a feasible solution, a seed, generally set using another form of heuristic, for example, a priority rule. To improve the initial solution, a neighbourhood search technique applies a preselected set of local operations recursively. The simplest method, pairwise interchange, swaps adjacent jobs in the priority list. These operations continue until no further local improvements occur. As neighbourhood search is a special case of non-linear hill climbing, it requires fine-tuning to achieve satisfactory performance. It is therefore rarely possible to predict how well a search strategy will perform. Instead, researchers usually evaluate and compare algorithms through a combination of empirical studies, using simulation, and commonsense arguments (Garey and Johnson, 1979). Neighbourhood searches look intensively for solutions that are close variants of the starting schedule. Other search methods were developed to overcome this myopia. Tabu search, simulated annealing and genetic algorithms, for example, extended the neighbourhood so that more diverse solutions would be explored. They start with more than one seed. Their methods for diversifying (choosing the next neighbourhood) and for reducing the search space vary.
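A minimal sketch of neighbourhood search by adjacent pairwise interchange is given below, applied here to total tardiness on a single machine. The job data, the seed sequence and the choice of objective are invented for illustration.

# Adjacent pairwise interchange on a single machine, minimising total tardiness.
# Each job is (processing_time, due_date); the seed sequence is an arbitrary ordering.
jobs = {"A": (4, 6), "B": (2, 5), "C": (6, 18), "D": (3, 8)}

def total_tardiness(sequence):
    t, tardiness = 0, 0
    for j in sequence:
        p, d = jobs[j]
        t += p
        tardiness += max(0, t - d)
    return tardiness

def neighbourhood_search(seed):
    current, best = list(seed), total_tardiness(seed)
    improved = True
    while improved:                       # repeat until no adjacent swap improves
        improved = False
        for i in range(len(current) - 1):
            trial = current[:]
            trial[i], trial[i + 1] = trial[i + 1], trial[i]   # swap adjacent jobs
            value = total_tardiness(trial)
            if value < best:
                current, best, improved = trial, value, True
    return current, best

print(neighbourhood_search(["A", "B", "C", "D"]))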


As neighbourhood search works with complete solutions, the computational time is proportional to the square of the number of jobs (Morton and Pentico, 1993). While still computationally demanding, such methods drastically reduce the search space. The computational burden is nevertheless generally excessive in the job shop, since evaluating each interchange requires a full simulation of the shop. Evaluating all the possible interchanges in a full neighbourhood is also demanding. An important exception is the makespan criterion, since evaluating an interchange only requires solving a longest-path problem through the shop. Beam search is used to reduce the search space explored by branch-and-bound. It restricts exploration to a limited set of promising branches (Morton and Pentico, 1993). Like neighbourhood search, it starts with a seed generated by a 'good' heuristic. In branching, it keeps only some branches. As the method is one of partial enumeration, the solution may not be optimal. While not finding the optimum, it can identify many solutions that are nearly optimal. This is a potent feature, as the persons responsible for scheduling then have the opportunity to consider external factors in selecting a schedule from many competing choices. While a preliminary schedule may order jobs by their attributes, search techniques then act to optimise an objective function without regard to attribute order. For example, pairwise interchange is independent of attribute values.
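The following sketch shows beam search in this spirit: partial sequences for a single machine are extended one job at a time, and only the most promising few (the beam) are retained at each level. The beam width, the job data and the use of accumulated tardiness as the ranking score are assumptions of the example rather than a published formulation.

# Beam search over single-machine sequences, keeping only `beam_width` partial
# sequences at each level, ranked by the tardiness accrued so far.
jobs = {"A": (4, 6), "B": (2, 5), "C": (6, 18), "D": (3, 8)}   # (processing_time, due_date)

def accrued_tardiness(sequence):
    t = tardiness = 0
    for j in sequence:
        p, d = jobs[j]
        t += p
        tardiness += max(0, t - d)
    return tardiness

def beam_search(beam_width=2):
    beam = [[]]                                    # start from the empty sequence
    for _ in range(len(jobs)):
        candidates = [seq + [j] for seq in beam for j in jobs if j not in seq]
        candidates.sort(key=accrued_tardiness)     # rank partial sequences
        beam = candidates[:beam_width]             # keep only the best few branches
    return beam[0], accrued_tardiness(beam[0])

print(beam_search())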

3.4.3.1 Simple Priority Rules

In contrast to search heuristics, priority rules construct a single solution: jobs are placed in priority order. Priority indices are calculated using some easily computed parameter of the jobs, operations or machines (Bauer et al., 1991). For example, a common rule is to order jobs from shortest to longest processing time (SPT). Particular priority rules have been found to satisfy specific performance measures. Except for highly restricted classes of problem, these cannot be shown to be optimal.


3.4.3.2 Regular Measures Applied to a Single Machine

Optimal schedules can be found by applying priority rules to the static scheduling of a single machine if the performance measures are regular. SPT produces a schedule that is optimal with regard to mean completion time, mean waiting time, mean lateness and the mean number of jobs in the shop. However, SPT minimises neither functions of due date nor the variance of job flowtime (Conway, Maxwell and Miller, 1967). Minimisation of the maximum job lateness and maximum job tardiness is achieved by using the Earliest Due Date (EDD) rule. If the jobs are sequenced in order of non-decreasing slack-time, then the minimum job lateness and minimum job tardiness are maximised. SPT, which does not consider due-date information, minimises mean lateness. Note that for lateness the appropriate rule depends upon its formulation: SPT minimises mean lateness; EDD minimises maximum lateness; smallest slack-time maximises the minimum lateness. With respect to tardiness, one can minimise the maximum value of tardiness (using EDD) or maximise the minimum tardiness (using slack-time). However, minimising mean job tardiness is difficult, as tardiness is not a linear function of completion time. Adding to the complexity is the desire to use weighted tardiness to represent dollar penalty rates. There has been little progress in finding a solution algorithm even for a single machine: minimum mean tardiness is NP-hard and minimum weighted tardiness is strongly NP-hard (Lawler et al., 1993). The form of the heuristics for this non-linear measure of completion times is more intricate than the simple rules so far discussed. Examples are the Wilkinson-Irwin procedure, which uses adjacent pairwise interchange comparisons in the construction of the sequence, and Carroll's COVERT rule (Baker, 1974; Morton and Pentico, 1993). Where EDD can satisfy all the jobs' due dates (the maximum tardiness is zero), there may be other sequences that also satisfy the due dates. If one of these also satisfies another performance criterion, it may be preferable. A judicious choice, if all jobs can meet their due dates, is to order the jobs so that the mean flowtime is a minimum. A procedure that complies with this double objective is given by Smith's theorem, which orders jobs by SPT subject to due-date constraints (Conway, Maxwell and Miller, 1967).
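The contrast between these rules can be seen on a small invented instance: ordering by SPT gives the smaller mean flowtime, while ordering by EDD gives the smaller maximum lateness.

# Comparing SPT and EDD on a static single-machine instance (all jobs ready at time 0).
jobs = [("J1", 7, 9), ("J2", 2, 12), ("J3", 4, 5)]   # (id, processing_time, due_date)

def evaluate(sequence):
    t, flowtimes, lateness = 0, [], []
    for _, p, d in sequence:
        t += p
        flowtimes.append(t)           # with rj = 0, flowtime equals completion time
        lateness.append(t - d)
    return sum(flowtimes) / len(sequence), max(lateness)

spt = sorted(jobs, key=lambda j: j[1])   # shortest processing time first
edd = sorted(jobs, key=lambda j: j[2])   # earliest due date first
print("SPT: mean flowtime %.1f, max lateness %d" % evaluate(spt))
print("EDD: mean flowtime %.1f, max lateness %d" % evaluate(edd))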


3.4.4 Non-Optimal Schedules

When operations require sequence-dependent set-up times, no known priority rule guarantees an optimal schedule, even for the simplest possible configuration, that of the single machine. For regular measures, only a few problem formulations exist for which a single priority rule produces an optimum. EDD, for example, minimises maximum tardiness for a single-machine problem, but minimisation of total tardiness is NP-hard (Lawler et al., 1993). The problem is the non-linearity of tardiness (see section 3.4.1). By using weights to represent costs, problems that would otherwise have fast exact solutions become too complex to solve optimally (Morton and Pentico, 1993). However, weighted flowtime for a single machine does not fall into this category (Lawler et al., 1993). It is easy to use and robust. In addition, schedules that are optimal for weighted flowtime are often still good for other objectives (Morton and Pentico, 1993). Rules that use processing time, or derivatives such as slack-time, must work with incomplete information. In practice, schedulers do not know processing times precisely until after the event. Thus, scheduling decisions are based, at best, on a priori estimates (Conway, Maxwell and Miller, 1967). Estimates fall somewhere between perfect correlation with actual values and independence. As the estimates need only be sufficient for ranking jobs in relative order, they do not have to be perfect. A consistent bias in the estimates has no effect whatever on the quality of the schedule, provided the rule does not depend on a threshold value triggering another function. The performance of a rule depends on a variety of assumptions and operating conditions. Consider, for example, a rule based on processing time. Its ability to meet a due-date objective depends on the spread of processing times. Obviously, if all processing times are nearly equal, the rule's performance will be different from a situation where the processing times are highly variable. The inference that can be drawn from the extensive discussion of priority rules in the OR literature is that the findings are inconclusive. Some notable reviews are Panwalkar and Iskander (1977), Mellor (1979), Graham, Lawler, Lenstra and Rinnooy Kan (1979), Graves (1981), Blackstone, Phillips and Hogg (1982), Rodammer and White (1988), and MacCarthy and Liu (1993). Furthermore, the


restrictive nature of the assumptions and conditions required for experimental investigation is far removed from actual manufacturing environments. This is discussed later in the chapter.

3.4.5 Problem Simplification

The scheduling environment is commonly much more complex than that described by static scheduling of a single machine. Sequence-dependent set-ups and multiple-resource problems make scheduling very hard (Morton and Pentico, 1993). Operations researchers have therefore sought ways of reducing complex scheduling environments to simple representations. They then seek to optimise the reduced form of the problem. They argue that this sets a lower bound on the solution for the real situation. See the standard texts for a comprehensive range of simplifications (Conway, Maxwell and Miller, 1967; Baker, 1974; Ben-Arieh, 1988; Morton and Pentico, 1993). Discussion of the following restrictive approaches for reducing constraints will sufficiently delineate the proposition that scheduling is a process of managing constraints:

• All jobs are considered available simultaneously;
• Instead of predictive scheduling, choose only the next job (dispatch);
• Processes are simplified (no re-entrant, no precedence);
• Multiple resources are simplified.

3.4.6 Time Simplification

Although jobs may arrive at varying times, to simplify the problem all jobs are presumed to be available simultaneously and all machines are assumed to be always available. The strategy is to only schedule, or re-schedule, at set times, for example, at the beginning of a shift. A heuristic that sets the priority of jobs is applied to the set of available jobs. Scheduling in advance produces lists of jobs at the machines: a queue in which the sequential order reflects priority order. The job to be loaded next is at the head of the queue. By treating dynamic arrivals as a static problem, scheduling reduces to the consideration of permutations. Except


for the simplest cases, the problem remains NP-hard. While SPT minimises flowtime for simple one-machine problems, minimising makespan on even two parallel identical machines is NP-hard. Morton and Pentico (1993) contend that considering all arrivals as simultaneous changes the problem too much to provide good bounds on the original problem. However, by setting the sequential order of the jobs in each queue, it offers some predictive insight for the dynamic case (see the discussion in Section 3.4.6.2 on mixing dispatch with prediction).

3.4.6.1 Scheduling Restricted to the Next Job: Dispatch Scheduling

Under static scheduling, a sequencing procedure orders all available jobs in a queue. Where there are multiple machines there may be timing constraints. For example, an operation may not begin until its predecessors are complete. The times at which each operation starts and finishes have to be considered to ensure that the timing constraints are not violated. A job shop acts as a network of interrelated queues. An exception occurs in applying the FIFO (First In First Out) scheduling rule under specific conditions. Then, the multiple machines decompose into independent machine queues (Jackson, 1967). As problems with multiple queues are intractable beyond two machines (discussed in section 3.4.2), a simpler method for scheduling is required. Instead of sequencing the order of future operations (predictive scheduling), a job dispatching rule can be applied. It is a very simple approach that decides only which operation to perform next on a given machine. A dispatching rule assigns a priority index to each job waiting at a given machine. When a machine next becomes free, the job with the highest priority is selected (Emmons, 1987; Sadowski and Harmonosky, 1987). Under the simplest form of dispatch, machines are never held in anticipation of the arrival of 'hot' jobs: idleness is not inserted. In dispatching, the job to be loaded on an available machine is selected from a metaphorical pool. At the time the machine becomes free, or just before, the job with the highest priority index is plucked from the


pool of available jobs. The advantage of this type of scheduling is efficiency in the use of machines according to immediate indicators (Newman, 1988). For example, if the dispatching index uses 'slack remaining', the dispatcher7 can calculate this at the time of selection. Being a localised decision that is taken at the time a machine becomes available, it:
1. accommodates dynamic arrivals;
2. simplifies the complex network of multiple machines to a one-machine decision problem.
Where arrivals are not simultaneous, problems for which optimality is realisable in the static case become indeterminate. When a machine becomes idle, the dispatcher only considers those jobs that are available for immediate loading. Therefore, it does not consider all jobs to be processed over the scheduling horizon. Dispatching jobs with the SPT rule, for example, may not produce SPT order for the complete job-set. Consequently, an objective found to be optimal for ordering static jobs sequentially would not necessarily be the optimum for the dynamic case. The most commonly used measure of shop congestion is the mean job flowtime. An equivalent measure is the mean number of jobs in the system. Because SPT is known to minimise flowtime in single-machine problems, it has been the subject of many studies for the job shop. Using tardiness as a measure, Elvers (1973) found that SPT also appears to be the best dispatching rule when a job's due date is endogenously set at less than seven times total processing time. As the performance of a particular rule depends on operating conditions, researchers have tested various rules by simulating different manufacturing environments. Conway (1965b) found that where congestion occurred (about 90% machine utilisation), the SPT rule performed better than slack per operation for mean lateness and the number of tardy jobs. Blackstone, Phillips and Hogg (1982) reviewed the use of 34 rules. In comparing results across studies, they emphasised cost-based criteria, followed

7 A 'dispatcher', depending upon context, can refer either to the algorithmic means for calculating priority indices and selecting the job with the highest index, or to a human scheduler acting in a dispatch mode.


by tardiness, lateness and, finally, flowtime or inventory costs. They concluded that SPT8 seems to be the best alternative when:
1. the shop does not set due dates;
2. the shop sets very 'tight' due dates; or
3. the shop sets 'loose' due dates during periods of great congestion.
Conway (1965a) found that the SPT rule was insensitive to the reliability of the information on processing time, with deterioration of performance being very slight even for estimates that were 100% in error. This is most desirable, as estimates of processing times, rather than actual times, often have to be used in practice. Conway and Maxwell (1962) found that within a multiple-machine environment, SPT retained the advantage of throughput maximisation, and imperfect information on processing times had little effect. It performed best on mean flowtime for the four methods of due-date assignment considered. For exogenously established due dates, they found that SPT also minimised mean lateness and the number of tardy jobs. The principal difficulty with SPT is that some jobs become very late. By modifying the rule, late jobs can be cleared: a control parameter is used to truncate the action of SPT (e.g., any job queued over a stipulated time receives priority based on FIFO). Due-date rules are often applied. They tend to produce a smaller variance in job lateness and a smaller number of tardy jobs than processing-time rules. Most researchers have found that slack-per-operation consistently outperforms other due-date rules. There are two ways of defining slack: static slack does not change during the time a job is in a given queue; dynamic slack is determined by using the current time rather than the time the job entered the queue. The R&M heuristic is a due-date rule that performs particularly well for weighted tardiness.9 It applies

8 For multiple operations, processing time is normally taken as the sum of the processing times for all the remaining operations.

9 The R&M heuristic, named after its developers Rachamadugu and Morton, is as follows. Let the priority πj be given by πj = (wj/pj) exp(−Sj+/(k·pav)), where Sj is the slack of job j at time t, k is a constant that has to be tuned to the job set, pav is the average processing time, and pj and wj refer to the processing time and weight of job j, respectively. Slack Sj = dj − (pj + t), where t is the current time, and Sj+ = max(Sj, 0).

to bottleneck machines and, in modified form, to parallel machines (Morton and Pentico, 1993). The priorities are dynamic, as they are based on dynamic slack and therefore change over time. The priority also depends upon the situation at the local machine, namely each job's processing time.
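A sketch of the R&M priority calculation described in footnote 9 follows. The queue contents, the current time and the value of the tuning constant k are invented for illustration; only the form of the index follows the footnote.

import math

# R&M (Rachamadugu and Morton) priority index for jobs queued at a machine:
# priority = (w/p) * exp(-max(slack, 0) / (k * p_avg)), slack = d - (p + t).
def rm_priorities(queue, t, k=2.0):
    p_avg = sum(j["p"] for j in queue) / len(queue)     # average processing time
    priorities = {}
    for j in queue:
        slack = j["d"] - (j["p"] + t)
        priorities[j["id"]] = (j["w"] / j["p"]) * math.exp(-max(slack, 0.0) / (k * p_avg))
    return priorities

queue = [
    {"id": "J1", "p": 5.0, "d": 20.0, "w": 1.0},
    {"id": "J2", "p": 3.0, "d": 12.0, "w": 2.0},
    {"id": "J3", "p": 8.0, "d": 40.0, "w": 1.0},
]
pri = rm_priorities(queue, t=4.0)
print(max(pri, key=pri.get), pri)   # the job with the highest index is dispatched next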

3.4.6.2 Mixing Dispatch with Prediction

While a pool metaphor properly characterises the dispatching process, jobs are often not as disordered as the metaphor suggests. Ordinarily, schedulers plan imminent production over a predetermined period, the next shift for instance, using an appropriate heuristic. Schedulers may often roughly estimate the progress of jobs through the production system without detailing the scheduling sequence at each machine. For key machines, their deliberations may take in sufficient detail for them to nominate a sequential order of processing. By having a sequential list of jobs, shopfloor personnel can anticipate production requirements as they know the order of work. By using dispatch they can then inexpensively react to unforeseen changes in state. The dispatcher makes the forecast schedule practicable (Morton and Pentico, 1993; Newman, 1988; Buxey, 1989). Where a scheduler has produced a static schedule as a first cut, the dispatching process starts with a list of jobs that are anticipated to be available, in priority order. At the time a machine is to be loaded, the job at the head of the queue will have the highest priority. If it is not available, the dispatcher selects the first available job in the queue. Thus, a machine will not lie idle waiting for an unavailable job while there is work available. Provided pre-emption is allowed, if the highest priority job becomes available, it may replace the job on the machine10.

10 This is more relevant to pre-empt resume than pre-empt repeat. In the pre-empt repeat mode the job has to be started over again each time it is interrupted. Under this mode, it is useless to start a job that cannot be completed.

Some priority rules rely only on the situation at the local machine. For example, SPT uses the processing times for operations on the particular machine. Although myopic, it is generally found in empirical studies to be robust. SPT and other myopic heuristics may nevertheless perform poorly where external occurrences affect global performance. In particular, if there is a strong bottleneck downstream, then jobs that make low use of the bottleneck, rather than of the current machine, should be given higher priority (Morton and Pentico, 1993). Where the priority rule requires local due dates, an iterative process to estimate lead time can result in significant improvements. The local due date for each operation is estimated by subtracting from the final due date an estimate of the lead time after the machine (Morton and Pentico, 1993; Sun and Lin, 1993). Through simulation, operation times can be predicted and performance estimated. This can be extended beyond the set of currently available jobs to include jobs not yet released but whose arrival times are known.
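The backward estimation of local due dates can be sketched as follows. The routing, the processing times and the crude lead-time multiplier are invented for illustration; in practice the lead-time estimates would come from iteration or simulation, as noted above.

# Estimating a local due date for each operation by subtracting, from the job's
# final due date, an estimate of the lead time of all downstream operations.
# Lead time per operation is crudely estimated as a multiple of its processing time.
def local_due_dates(operations, final_due_date, lead_time_factor=2.0):
    due_dates = []
    remaining_lead = 0.0
    for p in reversed(operations):             # walk the routing backwards
        due_dates.append(final_due_date - remaining_lead)
        remaining_lead += lead_time_factor * p
    return list(reversed(due_dates))

# A job with three operations (processing times) and a final due date of 40.
print(local_due_dates([5.0, 3.0, 4.0], final_due_date=40.0))
# -> [26.0, 32.0, 40.0]: earlier operations receive correspondingly earlier local due dates.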

3.4.7 Process Simplification

To reduce the number of feasible combinations, operations researchers generally restrict the degrees of freedom (Baker, 1974; Ben-Arieh, 1988; Morton and Pentico, 1993). Typically they apply, inter alia, the following restrictions:
1. Operations in a job have a fixed ordering;
2. A given operation can be performed by only one type of machine;
3. Each machine can process only one operation at a time;
4. An operation may not begin until its predecessors are complete;
5. Once an operation has begun on a machine, it must not be interrupted;
6. No processing of any operation can be done on more than one machine (no splitting).
The choice of objective affects the complexity of the scheduling problem. Where the objective function is a regular measure, a scheduler can ignore inserted idle-time when scheduling a single machine, and under some circumstances the one-machine problem may be tractable.


In a classic job shop, multiple machines may have a serial-operation precedence structure. The operation order is the same for all jobs. For example, in book production the order may be printing, collating and then binding. The order of processing can be simplified by processing the jobs at each machine in the same order (French, 1982; Morton and Pentico, 1993). This tightening of the permissible order structure is feasible for regular measures. The precedence structure is still applicable where not every job undergoes all operations. Where a job precedes another at one machine, it may arrive after the other at a later machine if it has to undertake an additional operation. However, if the processing order is forced to have the same permutation sequence at each machine, then the machine would need to remain idle until the appropriate job arrives. Such a permutation sequence would then be quite wasteful. In the pursuit of simplicity, schedulers try to reduce the number of factors they have to consider. For example, they may assume that all set-up times are sequence independent and that there is no priority ranking amongst jobs (Baker, 1974; Ben-Arieh, 1988). These assumptions increase the degrees of freedom but may result in practically infeasible or poor solutions being proposed.
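A permutation schedule of this kind is easy to evaluate. The sketch below computes job completion times for a small invented three-machine example (printing, collating, binding) when every machine processes the jobs in the same order.

# Completion times under a permutation schedule: every machine processes the
# jobs in the same order. proc[j][m] is the processing time of job j on machine m.
def permutation_completion_times(sequence, proc):
    n_machines = len(proc[sequence[0]])
    finish = [0.0] * n_machines              # finish time of the last operation on each machine
    completions = {}
    for j in sequence:
        ready = 0.0                          # job ready time for the first machine
        for m in range(n_machines):
            start = max(ready, finish[m])    # wait for both the job and the machine
            finish[m] = start + proc[j][m]
            ready = finish[m]                # job moves on once the operation ends
        completions[j] = ready
    return completions

proc = {"bookA": [3, 2, 4], "bookB": [2, 4, 1], "bookC": [4, 1, 3]}   # print, collate, bind
print(permutation_completion_times(["bookA", "bookB", "bookC"], proc))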

3.4.8 Resources Simplification

To reduce the complexity of scheduling problems, operations researchers try to reduce the number of machines that have to be considered. Multiple machines are reduced to a single-machine problem either by:
• solely considering the critical machine, that is, the bottleneck; or
• aggregation.


3.4.8.1 Focussing

Figure 18 The OR model simplifies the real scheduling problem to one machine (the figure contrasts two mappings from JOBS to Performance: the job processed by the shop and the OR model)

Where one machine is overused and the others are in large enough supply that queues do not develop, the scheduling problem reduces to scheduling the one machine forming the bottleneck. The bottleneck machine acts as a constraint on non-bottleneck activities (Goldratt and Cox, 1986). Therefore, it is scheduled first. The arrival times and due dates for the bottleneck machine have to be estimated. In simple embedded one-resource problems, the activities before or after the critical machine experience no waiting. The arrival time is then the head time, which is an estimate of the sum of the processing times for all of an operation's predecessor operations. The effective due date for an operation is found by subtracting the tail time from the final due date. The tail time is the sum of the estimates of the processing times for all its post-bottleneck operations. The problem now becomes a standard one-resource problem with dynamic arrivals (Morton and Pentico, 1993). A suitable heuristic is applied to the operations at the bottleneck machine to obtain a queue sequence (a sketch of the head-time and tail-time calculation is given at the end of this subsection). The simplest strategy is to assume a permutation sequence for all other machines; the sequence found for the bottleneck machine is employed at each machine. However, this strategy is not well suited to job shops, and knowledge of the schedule on the bottleneck does not directly produce the sequencing for non-bottleneck machines. An alternative approach is that provided by OPT (Optimised Production Technology). In contradistinction to traditional scheduling practice, it does not aim to maximise the utilisation of every machine. The throughput of the complete manufacturing system is


limited by the capacity of the bottleneck machines. Items produced at non-bottleneck machines in excess of their usage at the bottleneck add nothing to the throughput (Goldratt and Cox, 1986; Browne, Harhen and Shivnan, 1988). They merely increase WIP. Therefore, the OPT approach is to vary the size of process batches. While bottlenecks require large process batches, non-bottlenecks may require small process batches to reduce lead time and inventory. Rather than producing at a non-bottleneck machine more items than can be used at the bottleneck, a process batch is divided into smaller transfer batches. These vary by operation and over time, as they depend upon operating conditions. By increasing the number of set-ups on non-bottleneck machines, the flow of product to the bottleneck machines may improve. By judicious choice of batch size, bottleneck machines will not be starved of work and inventory costs will be kept low. While an explicit function for OPT's performance objective is not publicly available, the stated objective is to maximise throughput, minimise inventory, and minimise operating expenses for any set of operating-condition constraints. There are two main criticisms of OPT (Kerr, 1991; Morton and Pentico, 1993). First, it requires a well-defined and stable bottleneck. In many plants, the bottleneck is not clearly defined. In reality, focussing on one critical resource while pretending the other resources are in large supply may be too simplistic, as it is common for queues to form at more than a single resource. In the general classic job shop, the routes vary for different jobs. Therefore, there is not a single clear path before the bottleneck (Morton and Pentico, 1993). Second, the tightness of OPT schedules makes little allowance for unplanned interruptions. Bottlenecks can readily change. Manufacturing contingencies may cause a bottleneck to wander within the time scale of an OPT schedule. When such changes occur, OPT gives no guidance on schedule recovery, short of re-running the entire plan.
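The head-time and tail-time reduction described in the previous subsection can be sketched as follows. The routing, the processing-time estimates and the assumption that each job visits the bottleneck exactly once are invented for illustration.

# Reducing a shop to its bottleneck: each job's routing is a list of
# (machine, processing_time) pairs, and 'BN' marks the bottleneck operation.
def bottleneck_window(routing, release_time, due_date, bottleneck="BN"):
    idx = next(i for i, (m, _) in enumerate(routing) if m == bottleneck)
    head = sum(p for _, p in routing[:idx])         # work before the bottleneck
    tail = sum(p for _, p in routing[idx + 1:])     # work after the bottleneck
    arrival = release_time + head                   # earliest arrival at the bottleneck
    effective_due = due_date - tail                 # latest completion at the bottleneck
    return arrival, effective_due, routing[idx][1]

routing = [("M1", 3), ("BN", 6), ("M2", 2), ("M3", 1)]
print(bottleneck_window(routing, release_time=0, due_date=20))
# -> (3, 17, 6): a one-machine problem with dynamic arrival 3 and effective due date 17.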

3.4.8.2 Aggregation

By defining a resource "as a grouping of … productive capabilities with a single input queue" (Morton and Pentico, 1993), the only decisions that have to be made are the sequencing and release timing of jobs or operations from the input queue to


the resource. Using this definition, multiple-machine problems reduce to the 'one-machine' archetype:
1. Where there is a simple routing (typified by a flowshop), the multiple machines are aggregated and hence the problem reduces to that of a single machine;
2. A set of parallel machines is treated as a single machine.

Figure 19 Single-machine model for parallel machines (figure labels: JOB: due date, processing time; Performance: meeting due dates)

Parallel machines are commonly assumed to have exactly the same capabilities, with either identical or proportional processing speeds.11 They involve a single queue serving several machines and are treated as a single machine with a higher processing speed. Where the machines are identical, the processing time is the same no matter which machine is used; the aggregate processing speed is then mp, where m is the number of parallel machines and p is the speed of a single machine. A single queue feeds the parallel cluster. The job at the head of the queue is loaded onto the first available machine.

11 Some noteworthy articles on parallel machines are by Graham, Lawler, Lenstra and Rinnooy Kan (1979), Rajgopal and Bidanda (1991), So (1990), Tang (1990) and Wittrock (1990). Williams (1993) has developed a particularly efficacious method for scheduling parallel machines with set-up times.


The one-machine problem can often be adapted to parallel machines. For some measures of performance under static sequencing, an optimal solution may be found easily. The queue is first ordered using some priority rule. Then the next job is allocated to the machine that can finish it first. If the machines are identical, a job simply loads onto the machine that can start it first. Proportional parallel machines, for which speeds vary by a constant factor, are correspondingly more difficult (Morton and Pentico, 1993). The challenge for researchers is finding priority rules that meet desired objectives. Few are available. Minimum makespan, say, is trivial for one machine but NP-hard for two or more parallel identical machines (Baker, 1974; Ullman, 1976). The makespan problem, being mathematically formidable yet relatively simple to pose, dominates the literature. Morton and Pentico (1993) criticise this preoccupation of researchers with makespan applied to identical machines under static conditions. It considers only the utilisation of the last machine to finish and ignores the spare capacity on other machines that newly arrived jobs could use. While there is an algorithm for minimising mean flowtime on identical parallel machines, there is none for minimising weighted mean flowtime.
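A sketch of the allocation procedure just described, for identical parallel machines, is given below; the SPT ordering of the queue and the job data are assumptions of the example.

import heapq

# List scheduling on identical parallel machines: order the queue with a priority
# rule (here SPT), then load each job onto the machine that can start it first.
def list_schedule(processing_times, n_machines):
    machines = [(0.0, m) for m in range(n_machines)]   # (time machine becomes free, id)
    heapq.heapify(machines)
    assignment = []
    for p in sorted(processing_times):                 # SPT order
        free_at, m = heapq.heappop(machines)           # machine available earliest
        assignment.append((p, m, free_at))             # the job starts when that machine frees
        heapq.heappush(machines, (free_at + p, m))
    return assignment

print(list_schedule([4, 2, 7, 3, 5], n_machines=2))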


3.5 Light Relaxation

Figure 20 The path of light relaxation (flowchart nodes: constraints define manufacture; problem: no feasible schedule; relax constraints, heavily or lightly; problem: too many feasible schedules; simplify model; problem: which constraints to loosen; selection using goals; problem: still more than one feasible schedule; select using performance; problem: conflicting goals; evaluation)

Under heavy relaxation, the scheduling environment becomes quite unconstrained and the problem of finding polynomial time solutions arises. To reduce the combinatorial complexity and to make the problem more manageable some other constraints are added. These simplifications tend to make the problem unrealistic (see section 3.4.5). For example, machines do break down, actual processing times vary from estimates, some jobs do have a higher priority than others, and set-up times are sequence dependent. Under these rather common conditions the algorithms and scheduling systems developed for the classical formulation of the problem are of little practical value (Jackson and Browne, 1989). A local control perspective reduces the problem. At this level, dispatch priority heuristics are used to select the next job to be executed. While robust, their ability to effectively optimise overall performance depends on the sensitivity of the decision rule to the dynamics of the manufacturing system (Smith et al., 1990). Prediction is afforded by running a simulation of the plant. Often the constraints (e.g., standard lead times) used in devising predictive schedules through simulation are conservatively set to accommodate uncertainty in estimates and unforeseen circumstances. This gives rise to inefficient operation of the plant.


Practical problems, however, are more highly constrained than those used in the traditional OR formulation. Feasible choices are, at most, few and frequently there are none. It therefore becomes necessary to relax some constraints.12 Under these circumstances, complexity is not due to an excessive choice of feasible schedules (i.e., combinatorial complexity), but to the number and variety of the constraints themselves. Constraints may be either inviolable or merely preferable. Inviolable restrictions may come from physical constraints or the availability of machines, or may be of a causal nature (Smith and Fox, 1985). Causal restrictions pertain to operation and machine alternatives; tool, material and personnel requirements; and transfer times (Fox and Smith, 1984). They are conditions that must be satisfied before an operation can be initiated. Resources and times are assigned for each activity so that they obey the temporal restrictions of activities and the capacity limitations of a set of shared resources (Fox and Sadeh, 1990). Hard physical constraints (i.e., technological constraints) cannot be relaxed. Relaxation comes through variation of preference constraints, which, being preferences set by humans, are flexible within some bounds (Fox and Smith, 1984; Kempf et al., 1991). Preference constraints may derive either from explicit organisational goals or from unstated tacit preferences of the scheduler. Organisational goals are measures of the performance of the organisation. From the organisational level there are expectations on performance regarding the meeting of due dates, the amount of work-in-progress, the maintenance of adequate machines, productivity goals and shop stability (Fox and Smith, 1984; Newman, 1988; Hsu et al., 1993). By dividing a job into tasks that can be completed by a particular machine and setting the sequential order of operations on a machine, a scheduler places constraints on work activity. These multiple-criteria goals can be expressed as preference constraints, which may be in conflict. Depending upon the system's state, some goals will be perceived to be preferable to others.
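The distinction between inviolable and preference constraints can be illustrated with a small sketch: hard constraints are always enforced, while preference constraints are relaxed, least important first, until at least one candidate schedule survives. The constraint names, weights and candidate schedules are hypothetical.

# Candidate schedules are filtered by hard constraints; preference constraints are
# then dropped, least important first, until at least one candidate survives.
def select_schedule(candidates, hard_constraints, preference_constraints):
    feasible = [s for s in candidates if all(c(s) for c in hard_constraints)]
    if not feasible:
        return None                                    # hard constraints cannot be relaxed
    prefs = sorted(preference_constraints, key=lambda cw: cw[1], reverse=True)
    while prefs:
        satisfying = [s for s in feasible if all(c(s) for c, _ in prefs)]
        if satisfying:
            return satisfying[0]
        prefs.pop()                                    # relax the least important preference
    return feasible[0]

# Hypothetical example: schedules described by (max_lateness, overtime_hours).
candidates = [(0, 6), (2, 1), (5, 0)]
hard = [lambda s: s[0] <= 5]                           # technological/delivery limit
prefs = [(lambda s: s[0] == 0, 3),                     # prefer no late jobs (weight 3)
         (lambda s: s[1] <= 2, 1)]                     # prefer little overtime (weight 1)
print(select_schedule(candidates, hard, prefs))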

12 Minimally, constraints have to be sufficiently relaxed to obtain at least one feasible schedule. That is, for a schedule to be executed it has to be at least compatible with the technological constraints (French, 1982).


3.5.1 Artificial Intelligence

In recent years, the AI community has applied Knowledge-Based Systems (KBSs) to the scheduling problem. While the OR approach disregards all but a few quantitative indicators, AI can incorporate diffuse factors, which may be both quantitative and qualitative. KBSs come in various forms. Some use rule-based systems; others are based on frame representations. Some use only heuristic rules to construct a schedule, while others conduct a constraint-directed state-space search (Blazewicz, Domschke and Pesch, 1996). Kusiak's Knowledge-Based Scheduling System (KBSS) combines knowledge in a KBS with search (Kusiak, 1990). Scheduling instances that cannot be handled algorithmically are referred to the inference engine. The inference engine uses production rules, not related to dispatching rules, to generate a schedule. Frames are used to represent the declarative knowledge describing the production problem. The KBSS was evaluated on a set of test problems using three measures of performance: maximum flowtime, average flowtime, and machine utilisation. Kusiak claims that the KBSS schedules are of good quality, and are usually within a couple of percentage points of the optimum. The CPU time is modest, an indication that it may be suitable for real-time operations. Constraint-based reasoning, which reacts to the shop's current state, has been found useful. For example, schedule construction with ISIS, a knowledge-based system for factory scheduling, is cast as a constraint-directed activity that seeks feasible schedules satisfying the constraints placed on jobs and machines (Fox and Smith, 1984).13 OPAL combines different dispatching rules and constraints using fuzzy logic. Constraint-directed reasoning is first applied. Then the expert system classifies the permutable operations based on rules (e.g., SPT, slack time) in regard to meeting a given objective (Grabot and Geneste, 1994; Brown, Marin, and Scherer, 1995). The OPIS scheduling system introduced the notion of

13 While the domain information represented in its knowledge base is much more extensive than that used by OR heuristics, ISIS does not capture the constraints used by schedulers in job shops that were encountered in McKay et al.'s (1988) extensive study of schedulers.


opportunistic scheduling (Sadeh and Fox, 1990). The opportunism comes from the ability to detect the emergence of new bottlenecks during the construction of the schedule and to revise the current scheduling strategy (Sadeh, 1991). It uses constraint-based reasoning to direct problem solving in scheduling towards the most critical decisions that remain to be made, or revised, by considering areas of resource contention and schedule conflicts (Smith et al., 1990). It views all operations causing a bottleneck as critical. The system is constantly on the lookout for ways to alter or develop its plan of action based on the latest information. Heuristic scheduling techniques are thereby directed by knowledge of the active constraints and objectives. It has reactive scheduling capabilities. In the presence of unexpected events such as machine breakdowns or new job arrivals, the system can patch the current schedule. This is done by opportunistically firing a selection of specialised heuristics (Sadeh, 1992). In stating that opportunistic methods can yield optimal or near-optimal solutions, even for problem instances until now considered difficult, Blazewicz, Domschke and Pesch (1996) refer to the work of Adams, Balas and Zawack (1988), Ow and Smith (1988), Sadeh (1991), Balas, Lenstra and Vazacopoulos (1995), Dauzere-Peres and Lasserre (1993) and Balas and Vazacopoulos (1995). Crucially, they warn that the way a problem is decomposed affects the quality of the solution. Another constraint-based approach employs the user to guide the scheduling effort. In Elleby, Fargher and Addis's (1988) system, the human identifies desirable and undesirable features of a proposed schedule. The system constructs a schedule that is presented to a human expert for comment and critique. The knowledge obtained from the human scheduler results in a requirement, or constraint, that the system stores for future use. The schedule generator applies an incremental constraint-satisfaction routine using a backtracking approach. For a constraint-based scheduler to act effectively, all factors that are important in setting the schedule have to be heeded. Alas, this is also the root of an important shortcoming. To automate fully the construction of schedules, the system has to cater for all circumstances that could ever arise. Many factors may have to be thought about: some of these are the presence of multiple machines and multiple routings, operation precedence, job priorities, random failures, availability of material, changes in production goals, and the call to expedite some jobs. While

these factors probably can be placed in a rule base, the cost of their capture is usually high (Rodammer and White, 1988). Companies are reluctant to expend enough time and money on creating rule bases that are large enough to make well-formed decisions. There is also a perverse downside to having well-developed rule bases: extensive rule bases exhibit brittleness. Even if a rule base is large enough, what happens when the environment changes? Machines may change. Products may change. Unforeseen methods or materials may be introduced. These are but some possible changes. What then happens to the rule base? Is it upgraded? Are the changes ignored and its advice accepted? Is the schedule used only after whosoever holds responsibility has vigilantly made the necessary modifications? Factors that may result in quite divergent scheduling outcomes often have to be reconciled when a schedule is developed. How can this be done?

3.6 Perplexity

Perplexity as well as complexity makes scheduling difficult. A problem's complexity is associated with its size: the number of possible ways to lightly relax constraints or the number of permutations under heavy relaxation. Confusion and uncertainty are the hallmarks of perplexity14. Uncertainty can enter a problem in two ways (French, 1982):
1. It can arise because of our inability to measure anything perfectly. The underlying problem may be deterministic in the sense that all quantities are fixed; however, the values of these quantities may be uncertain. The question of robustness then arises.

14 Perplexity comes from the Latin word perplexus, meaning involved. The Oxford dictionary defines perplexity as the "Inability to determine what to think, or how to act, owing to the involved, intricate, or complicated conditions of circumstances, or of the matter to be dealt with, generally also involving mental perturbation and anxiety."


2. It can arise because the quantities in a problem are inherently variable. For instance, the processing times of apparently identical operations may vary considerably. There is uncertainty in arrival times, scrap rates and machine breakdowns.

Scheduling parameters such as processing times, material arrival times, machine availability, etc., are subject to uncertainty (Kempf et al., 1991). Processing times and material arrival times are 'crisply' defined scheduling parameters. While they may have uncertain values, they are independent of other scheduling parameters. However, assigning times for processing depends upon machine availability, which depends upon reservations already made on the machine for other operations. Scheduling methods either need to reflect explicitly the uncertain nature of the available information or give some guarantee as to the insensitivity of the schedule to future information. Constraints may also be vaguely defined. For example, an operation's due date depends upon the times elapsed between successive operations. These may be specified only approximately. Where a constraint is dependent on other events, uncertainty accumulates with the number of events. Hence, imprecision increases as uncertain dependencies increase. For example, the due date for each operation in a multiple-operation job depends upon the timing of downstream operations. A schedule will only be an approximate guide to shopfloor reality unless it is based upon forward loading to finite capacity and employs simulated waiting times, not predetermined averages (Buxey, 1989). Uncertainty also arises in constraint relaxation. When relaxing constraints, a scheduler needs to choose which to relax. Where their relative importance is ill-defined, selection is made even more difficult. There may be uncertainty in the meaning attributable to a constraint's value. Different departments or persons who can influence decision making may see particular constraints quite differently. A sales representative, a production supervisor and a customer may hold quite different views on the firmness of a due date. While a customer may not be unduly concerned about a late delivery, a sales representative may see a late delivery as a threat to his or her reputation. The value placed on any particular factor is an outcome of the interplay between interested persons and groups. There may be uncertainty in the meaning attributable to a measure of performance. For example, maximisation of utilisation


is normally considered a worthy operational objective, as an idle machine does not contribute to return on investment. However, the interpretation of what seems a simple quantitative measure is contingent upon the extent to which bottlenecks dominate the manufacturing system. Increasing the utilisation of a non-bottleneck machine may merely cause WIP to increase without altering the total throughput (see section 3.4.8.1). Uncertainty arises where bottlenecks are not singular, but instead form when operational demands on particular machines exceed their capacity. Which machines are bottlenecks may therefore vary over time. Where bottlenecks are not manifestly restrictive, queues that form may quickly abate. Consequently, an average utilisation of less than 100% may disguise the formation of a bottleneck. The scheduler therefore has to take care in attributing meaning to the value of average utilisation. For some factors, a correct reading of their meaning depends upon the context. For example, in the printing industry, an operation's processing time depends upon the type of ink and the weather. The ink's drying time may depend upon the density of the colour, the ambient temperature and the relative humidity. Processing speed and, consequently, processing time depend upon the drying time. In recent work the AI community has recognised that some constraints, such as due dates, are contextually dependent. Their significance may vary with the job and the working environment (Fox and Sadeh, 1990). For some customers the due date is rigid. Others may not mind receiving their orders a day or two late. Yet again, these very same customers may have jobs in the system with due dates that are atypically firm. While AI approaches can to some extent deal with context, generally they are unable to decipher qualitative differences in meaning without them being stated explicitly (Papantonopoulos, 1990). McKay (1997) categorises context-sensitive elements of a scheduler's task into information availability, constraint influence and relaxation, objective influences and demand, and work assignment and sequencing decisions. As information arrives in chunks periodically during the scheduler's day, the accuracy of the scheduler's mental picture of the system status varies over time. For example, on their arrival at the plant, schedulers usually have to quickly assess the current state of production using partial information and then make a series of decisions regarding production over the current shift. The scheduler's knowledge of the


constraints on the capabilities and performance of resources and the possibility and means for their relaxation are also context dependent. For example, McKay points out that the scheduler's approach to scheduling may be more reactive than predictive while he or she learns the capabilities and limitations of new equipment. The scheduler must also deal with objectives that derive from context-sensitive constraints, as shown by the above due-date example. Work assignment and sequencing decisions may be context-sensitive, for example, at the beginning of the day. To get all major resources and personnel working productively, work has to be assigned immediately. The scheduler makes decisions on the fly, for instance, to cover gaps due to absenteeism and shortage of material. In constructing or tuning the schedule later in the day, more measured decisions can be made. Different organisational units affected by the schedule may have different and conflicting goals (Gary et al., 1995). A marketing department seeks good due-date performance. A manufacturing department prefers high machine utilisation, few set-ups, and long production runs. What metrics should guide a scheduling system to ensure that the schedules it generates are consistent with long-term goals is a difficult question. It is very hard to reconcile a goal like maximising profit for this quarter with decisions made while scheduling an eight-hour shift. Developing a coherent set of performance measures for all levels of the corporate hierarchy is an unsolved problem (Fisher, 1992). Goals may not only be potentially in conflict (see section 3.4.1) but also imprecisely defined (Kempf et al., 1991). Customer satisfaction has a much broader and ill-defined meaning than the reified due-date surrogate conveys. Goals often go beyond the limited set of overtly economic objectives associated with minimising costs and maximising utilisation. In practice, human schedulers frequently try to minimise operating stresses (Rodammer and White, 1988). To realise this objective they may apply measures that improve schedule stability, reduce confusion, and placate demanding customers. To meet these objectives, they may deliberately under-utilise some machines. Such goals as these may be neither clear nor explicit. Human schedulers pursue goals that they have not articulated. They do so by embracing practices they believe to be good. These practices often evolve over years of experience with the production process.


3.6.1 Deep Knowledge and Constraint Relaxation

Producing broad predictive schedules using average lead times, or queue lengths, may promote local efficiencies through reliance on large levels of WIP to provide leeway for uncertainties (Buxey, 1989). This may be somewhat ameliorated using simulated forward loading in environments where operating conditions are quite predictable. Such order, more typical of flow shops, is uncommon in job shops. Buxey notes that job shops rely greatly on rough estimates, technical inputs from skilled workers, and reactive production control. The narrowness of the classical formulation overlooks accepted shopfloor practices for improving performance. Depending upon the circumstances surrounding a particular situation, batch sizes may be varied. In an environment where operations normally are not overlapped, overlapping may be considered as an option for crucial jobs. In addition, jobs may be split or overlapped (Browne, Boon, and Davies, 1981). In real job shops, schedulers take account of many factors that go beyond the confines of the information displayed on a Gantt chart or a machine-loading board, or held within a scheduling system. Their scheduling decisions are affected by plant operating conditions that go beyond job and machine availability. Broader information is required to make decisions that take in, inter alia, labour allocation (e.g., absenteeism, skill distribution, overtime and extra shifts), the availability of tooling and raw materials, the use of subcontractors, and the use of alternative job routes. Under special circumstances (for example, when a machine breaks down, when the characteristics of the job are abnormal, or when the demands on a machine's operating time are exceptionally high), whoever is responsible for production may make changes to the machines. Creative solutions can often be found through a deep-seated understanding of the machines that are available. Parts may be removed from one machine to be placed on another. Machines may be modified, or used, in ways the designers had not envisaged. When an operation places extraordinary demand on capacity, a decision may be made to extend capacity by:
• increasing a machine's processing speed beyond normal limits;
• using machines that ordinarily would not be used for the particular operation;
• subcontracting.


A machine put to exceptional use may require either a minor adjustment, as would be the case if a mill were used as a drill, or significant modification. Using a machine in an exceptional way requires changing normally inviolable constraints. Schedulers need to deeply understand how a machine functions to recognise the conditions and requirements for violation to be admissible. Increasing a machine's processing speed beyond normal limits requires a scheduler to be aware that it is possible and to appreciate the repercussions of the action: shorter life, lower quality, etc. To subcontract operations a scheduler needs to know which companies can carry out the work within the bounds set by the constraints (e.g., due date, geometric tolerances and material properties). Deep understanding of the manufacturing domain is necessary when multiple objectives are pursued. AI systems have been developed that apply objective functions that are not unlike those found in OR (Fox and Smith, 1984). Where objective functions include disparate factors, some agent must decide how to equilibrate them. Who is this agent? How does it contend with idiosyncratic constraints or preferences? MICRO-BOSS, a bottleneck scheduling system, alleviates this problem by using a singular ability of humans: the propensity to look at a situation and see from the context what is, and what is not, essential (Sadeh, 1991; Papantonopoulos, 1990). Classical OR approaches scheduling complexity by minimising information. Jobs are described by a few defining attributes (see section 3.1). Characteristics of the real problem are ignored: set-up times are assumed to be predictable; measures of performance not simply directed towards profit maximisation are disregarded. Scheduling heuristics are also minimalist. They use a few job attributes, at most. For example, in many problems a 'greedy' (or myopic) type of heuristic is used. While being very easy to implement, it ignores everything that happens beyond the immediate step. High productivity is achieved in practice not from mathematical calculations but from engineering knowledge. Job shops rely upon experienced supervisors progressing work to take advantage of common requirements for tooling, fixtures, etc., and available labour and machines (Buxey, 1989).


3.6.2 Uncertainty and Robustness

A scheduling system must accommodate the many different situations that develop on the shop floors of real manufacturing establishments. There may be minor changes to overcome unforeseen disturbances. Adjustments may be necessary because of the disparity between actual and estimated values of some constraints. The system therefore needs to be sufficiently flexible to address the various contingencies that arise (Buxey, 1989; Svestka, 1984).

Scheduling by dispatch is inherently robust, as a dispatcher normally uses the current shopfloor state to make a decision. A forecast schedule is considered robust if it remains valid under many different types of disturbance. Robustness is clearly a desirable attribute of a predictive schedule, as it reduces the number of subsequent reactive scheduling decisions required as the schedule is executed. Wherever real-time scheduling is used with a production plan (i.e., a forecast schedule), a scheduler attains robustness by making allowances for minor changes. For example, large flow times in the forecast allow flexibility in processing times, queue delays, and so on. Instead of planning for maximum utilisation of all machines, the forecast schedule may build in idle time to provide leeway for actual operating conditions that mismatch the estimates, and for other contingencies. These practices allow a real-time scheduler (i.e., a dispatcher) sufficient degrees of freedom to react to disturbances without having to repair the original schedule. By limiting repair in this way, the lack of continuity in detailed shop-floor plans that accompanies scheduling instability, known as nervousness, is avoided. However, this flexibility is costly, as it brings a high level of WIP and under-utilisation of machines (Erschler and Roubellat, 1989).

A decentralised control structure makes managing uncertainty easier and therefore provides greater robustness, because it reduces complexity. Each sub-system works within its own parameters and guidelines to plan and control the flow of work through the manufacturing system (Solberg, 1989). Machine assignments can be negotiated in real time. Decentralisation also allows the formation of independent software objects, each containing well-defined functions.
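As a small numerical illustration of the leeway described above, the Python sketch below (with made-up figures) checks whether the idle time built into a forecast operation absorbs an overrun, so that the dispatcher need not repair the plan.

```python
# Minimal sketch (hypothetical figures): planned slack in a forecast schedule
# absorbs small disturbances so the dispatcher need not repair the plan.

def absorbs_disturbance(planned_start, planned_finish, estimated_duration, actual_duration):
    """Return True if the built-in slack covers the overrun on this operation."""
    slack = (planned_finish - planned_start) - estimated_duration
    overrun = actual_duration - estimated_duration
    return overrun <= slack

# A 3-hour operation planned in a 4-hour window leaves 1 hour of leeway.
print(absorbs_disturbance(planned_start=8, planned_finish=12,
                          estimated_duration=3, actual_duration=3.5))  # True
print(absorbs_disturbance(8, 12, 3, 4.5))  # False: the schedule would need repair
```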


3.6.3 Perplexity and the Need for a Paradigm Shift

The dominant approaches to scheduling, whether classical OR or AI, are posited on knowing the behaviour of the manufacturing system a priori. For the behaviour to be predictable, the system has to be highly deterministic, with any estimated factors having tight probabilistic bounds. To meet these conditions a manufacturing system needs the following attributes (McKay, Safayeni and Buzacott, 1988):
• A stable, simple and well understood manufacturing process;
• Simple manufacturing goals that are not affected by hidden agendas;
• Short cycle times, so that work can start and finish without interruption;
• Predictable and reliable set-up and processing times;
• Known delivery quantities, delivery times, and delivery qualities;
• Long times between failures compared with cycle times, and short repair times;
• Accurate and complete information, in the computer, on processing requirements and on the status of jobs.

Flowshops that manufacture only a limited range of standard products may fit this mould. Work flow, while perhaps not optimal, tends towards the routine. Production costs and the size of inventories of finished goods occupy the attention of production controllers. From their extensive study of real job shops, McKay, Safayeni and Buzacott (1988) found that none satisfied even a small portion of the above list of attributes. Job-shop production is much more diverse than that of flowshops. Job shops compete for a variety of individual orders from different customers, and orders are won on price and delivery dates. As the workflow is not routine, production controllers have to focus upon its management with due regard to performance criteria (Buxey, 1989). However, in an environment where jobs commonly arrive unheralded and with short lead times, production controllers lack time for detailed planning.

McKay, Safayeni and Buzacott (1988) found that typically a shop is seldom stable for longer than half an hour. Something is always happening unexpectedly, and the effects normally last longer than the batch processing time for the work in process in the affected area. New jobs usually arrive before previously scheduled jobs in the system have been processed. The new arrivals may make the prevailing plan irrelevant. The state of the shop may restrict the choices available for amending the schedule. For example, it may be impractical to alter the place of some jobs in a queue. Often jobs for which processing is imminent have already placed calls on resources and materials, and reversing these calls may be difficult. Re-allocating materials that have been earmarked for one job to another job can make tracking materials difficult. If the materials for an operation consist of parts produced by preceding operations, associating materials and jobs can be quite a perplexing activity. Changes may also be restrained to limit the chaos and confusion on the shop floor caused by chopping and changing the order of work. Under these circumstances, if not all jobs in the queue are available for revision, the justification for applying a particular heuristic becomes questionable.

Simplifying assumptions, intended to remove the computational complexity, also make the problem less relevant to actual practice. Theoretical models that perceive shops as simple, stable and well understood, with predictable and reliable set-up and processing times and known delivery quantities, times and qualities, are far from the shop-floor reality in most cases (McKay, Safayeni and Buzacott, 1988). Buxey (1989) questions the value of any production schedule, given that the following factors could apply:

• An unpredictable level of absenteeism;
• Equipment breaking down;
• The volume of information to be handled, which allows capacity requirements to be calculated only for aggregates;
• Time spent queuing at process stages, and for transport between them, being highly variable;
• Operation times used for planning purposes being rough estimates;
• Customers (or the marketing department) cancelling orders at short notice, or altering design specifications, order quantity, delivery date, etc., even after work has commenced;
• Items being scrapped, downgraded, or scheduled for reworking following quality inspection.

McKay, Safayeni and Buzacott (1988) contend that "The problem definition is so far removed from job-shop reality that perhaps a different name for the research should be considered." They questioned the relevance of its theoretical formulation: a formulation in which the underlying assumptions and structure have remained virtually unchanged for 30 years. Their opinion comes from surveying 40 schedulers and conducting four informal, explorative case studies. They found that the concerns and needs of practising schedulers differed from OR interests. Over a series of job-shop scheduling seminars, in which more than 200 schedulers participated, they confirmed that some of these concerns and needs were widespread.

Practising schedulers live amongst perplexity. They have to satisfy many stated and unstated conflicting goals, using hard and soft information that is possibly incomplete, ambiguous, biased, outdated, and erroneous. Goals may be neither clear nor explicit. Schedulers pursue goals without fully articulating them, by following practices that years of experience within the industry lead them to believe are good. Most schedulers would find it alien to measure performance using aggregated linear functions that are weighted sums of constituent goals. Any issue at some time can affect the scheduling decision (McKay, Safayeni and Buzacott, 1988). The factors that are paramount change with time, date, mood, climate, and so forth. Not all constraints and goals may be active simultaneously. Constraints may include environmental and seasonal conditions, transportation, raw material, type of work, labour force, and labour rates. Goals depend upon the hour or the day, and constraints change: "what is a 'good' schedule generated Monday morning may be considered a 'bad' schedule if generated Monday afternoon" (McKay, Buzacott and Safayeni, 1989). Therefore, a scheduler needs deep knowledge of the working environment to formulate good practice.

Schedulers cannot ignore the predicaments of the real world. Work environments are generally unpredictable. Unplanned events occur frequently during the scheduled period. People often have to schedule in circumstances where procedures are ad hoc, records are kept on paper, and decisions are made that are almost arbitrary and outside the control of the scheduler (Solberg, 1989). The strategies schedulers pursue often do not adhere to classical assumptions. They:
• Assign priorities to jobs;
• Change the size of batches;
• Split operations between machines and overlap operations to speed up work;
• Interrupt operations to run more urgent jobs;
• Renegotiate due dates with customers to spread the work load; and
• Use machines in non-standard ways to increase short-term capacity.

Almost any factor, constraint, or goal can have variety, and certain combinations of variety may or may not affect other constraints or goals (McKay, Buzacott and Safayeni, 1989). Where such factors are myriad and subjective, algorithmic dispatching rules alone ill equip schedulers for carrying out their responsibilities. In practice, the objectives sought are ill defined. Hence optimality is hard, if not impossible, to define, and seeking optimality is inappropriate. Consequently, scheduling activity cannot be based on procedures that will ensure near-optimal schedules. As it is impracticable to conceive scheduling goals as the optimisation of performance measures, a more appropriate objective is to seek efficient performance. Efficiency relates to practicable issues: the effort spent on scheduling activity compared to the improvement in a performance measure (which probably relies on estimates and possibly erroneous data), robustness, and nervousness. An efficient schedule is sufficiently flexible to allow for contingency replanning while retaining a high degree of overall stability (Buxey, 1989). It is preferable to produce and maintain satisfactory schedules over time. Instead of trying to marginally improve performance during reactive scheduling or dispatch, a predictive schedule that satisfices some goals should be maintained where possible (Kempf et al., 1991).

In scheduling, theoretical methods are the antitheses of praxis: their goals differ; they use different information; and they observe different practices. Therefore, a paradigm that sees scheduling as algorithmic methods for idealised problems needs to be replaced by an approach that encompasses planning the flow of work in complex and perplex manufacturing environments. Morton and Pentico (1993), in their exceptional treatise on the mathematical formulation of scheduling problems, declare that "Breakthroughs in scheduling methodology, practice, and software are sorely needed. All useful approaches should be pursued."
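Returning to the contrast drawn above between aggregate objectives and satisficing, the short Python sketch below compares the two evaluation styles. The measures, weights and thresholds are illustrative assumptions, not values drawn from any study cited here.

```python
# Sketch contrasting a classical weighted-sum objective with a satisficing check.
# Weights and thresholds are invented for illustration only.

def weighted_sum_score(tardiness_hours, wip_count, setups, weights=(1.0, 0.2, 0.5)):
    """Classical aggregate objective: a single number to be minimised."""
    w_tardy, w_wip, w_setup = weights
    return w_tardy * tardiness_hours + w_wip * wip_count + w_setup * setups

def satisfices(tardiness_hours, wip_count, setups,
               max_tardiness=8.0, max_wip=40, max_setups=12):
    """Satisficing check: is the schedule 'good enough' on each goal separately?"""
    return (tardiness_hours <= max_tardiness
            and wip_count <= max_wip
            and setups <= max_setups)

print(weighted_sum_score(tardiness_hours=6, wip_count=35, setups=10))  # 18.0
print(satisfices(tardiness_hours=6, wip_count=35, setups=10))          # True
```

The point is not the arithmetic but the shape of the evaluation: the first style demands that disparate goals be equilibrated by some agent in advance, whereas the second only asks whether each goal is acceptably met.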


3.7 Human

Extensive OR and AI activity on automated scheduling has had little impact on scheduling practice in discrete manufacture. Production scheduling remains a skilled craft practised by experienced human schedulers (Rodammer and White, 1988). They often use a simple scheduling aid such as a machine-loading board or Gantt chart, complemented by pencil, paper and calculator for carrying out rudimentary calculations (e.g., of processing time) to obtain rough estimates of machine utilisation, idle time and possible constraint violations.

Machine-loading boards merely hold an ordered list of the job operations waiting to go onto each machine. Commonly, a separate tag is used for each job, and the information written on these tags is rudimentary. A tag's position shows when the job will be loaded relative to others in the queue. However, schedulers can see neither the expected time of loading nor how long the operation will take to execute. Moreover, they can see neither when machines are forced to wait for jobs to become available nor when jobs are expected to be ready for delivery (Gibson and Laios, 1978). Information on the tags is usually limited to job attributes. Machine attributes (e.g., machine utilisation) and process attributes (e.g., processing time) are not shown, and the scheduler would have to calculate them. Therefore, constraints that are violated may not be apparent. To some extent, Gantt charts overcome this problem as they clearly display processing times, machine reservations and expected delivery dates. Hence, schedulers can rapidly evaluate machine utilisation and lateness for any schedule they propose.

There has not been any notable change since Graves (1981) wrote that scheduling systems were predominantly manual, especially in simple environments with a few processing steps. He stated that it is often not clear to an observer exactly how schedulers construct schedules and compare and evaluate alternatives. Schedule evaluation, however, seemed to him to be qualitative. The dominant criterion was claimed to be schedule feasibility, although other criteria such as schedule flexibility may be important. He noted that such systems seem to work, in that the generated schedules are viewed as being quite satisfactory.
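The rudimentary arithmetic referred to above can be sketched in a few lines of Python; the schedule data here are hypothetical, standing in for what a scheduler would read off a Gantt chart.

```python
# Minimal sketch of the arithmetic a scheduler performs with a Gantt chart:
# machine utilisation and job lateness. The figures are hypothetical.

# (job, machine, start, finish, due_date) in hours from the start of the horizon
operations = [
    ("J1", "M1", 0, 3, 8),
    ("J2", "M1", 4, 7, 6),   # finishes at 7, due at 6 -> 1 hour late
    ("J3", "M2", 1, 5, 10),
]
horizon = 8.0

def machine_utilisation(ops, machine, horizon):
    """Fraction of the horizon during which the machine is busy."""
    busy = sum(finish - start for _, m, start, finish, _ in ops if m == machine)
    return busy / horizon

def lateness(ops, job):
    """Completion time of the job's last operation minus its due date."""
    finish = max(f for j, _, _, f, _ in ops if j == job)
    due = next(d for j, _, _, _, d in ops if j == job)
    return finish - due

print(f"M1 utilisation: {machine_utilisation(operations, 'M1', horizon):.0%}")  # 75%
print(f"J2 lateness: {lateness(operations, 'J2')} h")                           # 1 h
```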


Fox and Smith (1984) found, for the company they studied, that only 10% to 20% of the human scheduler's time was spent on scheduling. The scheduler spent between 80% and 90% of his time identifying problem constraints by communicating with other employees. While this company may not be typical, it focussed the interests of the team from Carnegie-Mellon University (CMU) on the primacy of constraints.15 Many constraints or issues can affect the scheduling of different parts of the shop at different times for different reasons. In a later case study from CMU, Hsu, Prietula, Thompson and Ow (1993) also observed that an expert scheduler's approach was mostly constraint-driven rather than directed at optimising an objective function.

From their field studies, McKay, Safayeni and Buzacott (1988) compiled about 200 constraints that applied to operational, physical, process planning, work force and administrative issues. In his field study, McKay (1987) found that only a few per cent of the constraints used by schedulers in decision making were normally supplied facts. The remainder were semantic relationships requiring inference and induction. When judging the potential difficulties likely to arise in processing a job, schedulers may compare the current and previous states of possible machines. They take an interest in anything that has changed on those machines since the last time the item was made (materials, tooling, fixtures, maintenance, policies, worker training, procedures, etc.). McKay (1987) sees this activity as dominant in the decision process.

Schedulers often form quite complex models. For example, the scheduler in Kerr and Ebsary's (1985) study of a small manufacturer of pumping equipment used 430 rules. They included priority rules, forward loading rules, dispatch rules, contingency rules, and time conversion rules. These were not static but changed continually.

15 A proposed schedule is distributed to persons in every department. Each person on the distribution list may provide information that may alter the schedule. Quite unlike the average job shop, lead times of up to three years may occur.


Depending upon the system's state and the goals sought, schedulers attend to key issues and ignore other parts of both plant and work. They pursue the perplexing, non-routine factors that cause snarls at bottlenecks. They do not waste time on the routine or mundane: they simply release such work and let other persons in the manufacturing system worry about the details. Perplexity extends beyond the usually accepted sources (i.e., arrival times, scrap rates and machine breakdowns). It may depend upon the attitude of the workers or other qualitative attributes that are not always relevant, and its degree may be contextually dependent. An item that is complex for one shift, for example, may not be complex for another.16

Human schedulers, unlike classical OR methods, can handle information that is qualitative and context-specific (Papantonopoulos, 1990). Using intuition, they fill in the blanks about what is happening, what can happen, and what will happen on the floor, drawing on sensory data and a mental model of the situation (McKay, Safayeni and Buzacott, 1988). In real scheduling environments there are many dynamic factors. Machines may break down; operators may be absent; forecast planning may be poor; there may be unplanned changes in system status, such as the arrival of new jobs, reworking, running out of stock, or changes in priority due to unanticipated changes in capacities, costs, or due dates. Human schedulers must deal directly with these contingencies. They may be left with insufficient time to update databases to reflect the change of state, even assuming they had the necessary predisposition. For knowledge of a system's current state, production controllers therefore cannot rely on inaccurate databases. However, humans can identify a system's state when confronted with incomplete and ill-defined data. They can make decisions and then act upon them, under conditions that are infeasible for a computer-based system (Ammons, Govindaraj and Mitchell, 1986; Johnson and Wilson, 1988; Sharit, Eberts and Salvendy, 1988; Tabe and Salvendy, 1988; Sanderson, 1988).

16 Perhaps particular machines or skilled operators that are available for one shift are not available for another, or environmental conditions differ between the night and the day shift.


Earlier in this chapter, constraints were shown to be either inviolable or preferential. Constraints tend to be imprecise; release and due dates are often vague. Preference constraints represent the subjective preferences of a human scheduler or derive from strategic organisational goals. Where goals conflict, current preferences reflect a resolution of the conflict. The form of the resolution depends upon the interplay between persons representing the different organisational positions. To carry out their tasks, schedulers need the means to quickly and easily (Gibson and Laios, 1978):
1. Check for constraint violation (see the sketch below);
2. Identify the limitations of the current schedule;
3. Identify the means of improving the schedule.
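As a minimal illustration of the first of these needs, the Python sketch below checks a proposed operation against a hard (inviolable) constraint and a preference constraint. The particular constraints and data are assumptions made for the example, not a catalogue from any study cited here.

```python
# Sketch of a constraint check that separates inviolable (hard) constraints from
# preference (soft) constraints. Constraint names and data are illustrative only.

def check_operation(op):
    """Return lists of hard and soft violations for one proposed operation."""
    hard, soft = [], []
    if op["machine"] not in op["capable_machines"]:
        hard.append("machine cannot perform this operation")   # technological
    if op["start"] < op["material_ready"]:
        hard.append("material not yet available at start time")
    if op["finish"] > op["due_date"]:
        soft.append("operation finishes after the due date")   # preferential
    return hard, soft

proposed = {"machine": "M2", "capable_machines": {"M1", "M2"},
            "start": 4, "finish": 9, "due_date": 8, "material_ready": 3}
hard, soft = check_operation(proposed)
print("hard:", hard)  # []
print("soft:", soft)  # ['operation finishes after the due date']
```

Keeping the two classes of violation separate mirrors the distinction drawn above: a hard violation rules the proposal out, whereas a preference violation flags something the scheduler may choose to relax.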

3.7.1 Difficulties Experienced by Human Schedulers

Schedulers supported only by simple aids often find scheduling laborious. Moreover, the responsibilities and duties of scheduling personnel normally extend to the operational performance of the manufacturing system. At times, their attention may focus on monitoring the system for problems, troubleshooting problems that arise, and instigating actions to restore it to an operational mode (Aström, 1985; Fox and Smith, 1984; Jones and Maxwell, 1986). Where equipment is expensive, shopfloor personnel may be under intense pressure to maximise performance (Zimolong, 1987). Moray et al. (1990) found that the subjective workload in undertaking scheduling tasks is strongly dependent on time pressure. When schedulers have little time to plan their decision-making strategies, system performance deteriorates. Feeling exceedingly challenged, schedulers curb their attention to individual decisions and their immediate effects (Kempf et al., 1991).

In studying scheduling practice, McKay, Safayeni and Buzacott (1988) observed that schedulers avoided long-term detailed scheduling. As the dynamics of the shop change, many schedulers use only simple dispatching rules applied to very short time horizons. They restrict themselves to short horizons because they are unsure how the changes they make ripple through the schedule, affecting other parts. The challenge is to extend their horizons so that they develop a feeling for the global effect of their decisions.

An early study of flexible manufacturing systems by Stecke and Solberg (1977) drew attention to the serious under-utilisation of the capabilities of a system relying upon manual scheduling. Sanderson (1989) noted the following difficulties that confront human schedulers:
1. Understanding events that may be changing rapidly, but whose outcomes may not be clear for days or weeks;
2. Mentally integrating the locally independent states of the discrete units they are simultaneously watching, to estimate the overall system state;
3. Determining trends from a set of discrete states set widely apart in time; and
4. Reacting quickly when system components are tightly coupled.

High interactive complexity between operations, accentuated by tight coupling, makes scheduling problems prone to instability (McKay and Wiers, 1997). An inexact mental picture of the state of the manufacturing system may affect schedulers' performance. They may not have updated their knowledge of the current state of the system, or they may have lost their long-term mental model of the system's functions and structure (Bainbridge, 1983; Mitchell and Miller, 1986; Sharit and Salvendy, 1987).

3.8 Actively Engaged Human using Computer-Based Tools

Automated scheduling techniques, where applied, are rarely autonomous. A forecast schedule produced by a computer, typically for a week, invariably has to be corrected. Using a computer to completely reconstruct a schedule (often taking hours) after every minor breakdown or change in arrival times is impractical. Consequently, reactive scheduling tends to be completely manual (Morton and Pentico, 1993). In carrying it out, humans have to identify the system's state, decide upon the course of action, and then carry it out (Ammons, Govindaraj and Mitchell, 1986; Johnson and Wilson, 1988; Sharit, Eberts and Salvendy, 1988; Tabe and Salvendy, 1988). As humans have to revise the forecast regularly, users of the system tend to lose confidence in it, leading in many cases to its demise.

By pursuing the strengths of traditional operations research, knowledge-based systems, and sophisticated user interfaces, Morton and Pentico (1993) see useful scheduling systems emerging. The argument presented in this chapter is that scheduling production is the management of constraints. Therefore, a constraint-based scheduling system should produce feasible schedules (Fox and Smith, 1984). For the system to be effective, that is, to operate without human intervention, it has to cater for all factors and circumstances that could ever arise. This depends upon a knowledge engineer being able to collect all the rules a scheduler has in his or her repertoire. Alas, this is its Achilles heel. While rules can be written for many factors, their capture is usually costly. Many rules, however, cannot be readily captured. Some rules are subtle and therefore difficult to codify. Others may be controversial; disagreement among experts may depend upon their organisational position or their experience. As humans create new expertise in new situations, rules may not be available at the time of capture. Rules stated to a knowledge engineer, or actions interpreted by an observer, may only model surface expertise, as humans apply broad-based tacit knowledge.17

17 It is common for developers of scheduling systems to capture only shallow expertise and thereby inadequately comprehend the dominant domain constraints (Kempf et al., 1991).

Knowledge-based systems require all pertinent information to be articulated a priori to reduce deductive and inductive inferences to simple associative inferences (Sutherland, 1986). This causes difficulties for managing perplexity. However, humans can bring to the decision-making process special competencies pertinent to scheduling (Sheridan, 1976). They can handle unexpected events and formulate general rules from specific cases (Sharit, 1984; Meister, 1966). Humans can apply inductive logic to get beyond the imperative for computers to have all pertinent knowledge articulated a priori. Using their tacit knowledge, they can handle exceptional problems such as lot splitting, enforced idleness, re-routing due to breakdowns, and expediting important jobs that are tardy (Morton and Pentico, 1993).

Generally, computers are seen to be advantageous for manipulating large amounts of data and for automating procedural steps. Nevertheless, when the number of conceivable states in discrete-parts manufacture is large, an automated system may be unable to cope with all the combinations it has to search. On the other hand, humans may find such situations quite simple to manage. Their perceptual abilities help them to discriminate between items of contextual information used in scheduling, such as those relating to the costs associated with lateness and the assignment of scheduling priorities (Papantonopoulos, 1990). By using inductive logic, readily recognising patterns in the data, and identifying what is, and what is not, essential, they can narrow the search space (Sharit, 1984). By intervening in this way, they reduce the number of constraints that the computer has to manipulate (Sanderson, 1988).

Humans working in the particular environment understand many aspects of scheduling jobs within a plant best. Within the context of local knowledge, they know how to handle information that may be diverse, inexact, or conflicting. Instead of focussing on mathematical techniques, the problem's locus should move to the needs of the persons who have to take responsibility for the planning of production. This shift provides an avenue for the creation of a general-purpose scheduling tool. All the complex problem attributes in their full richness are left to the human. A computer-based tool has to capture only the most important features of a given environment, ignoring the least important entirely and dealing with those of intermediate importance in an aggregate way (Rodammer and White, 1988). Therefore, the pitfalls of massive data requirements and slow execution are avoided. Humans and computers acting in concert can exploit their respective advantages. Experimental studies suggest that scheduling based on actively engaged humans using computer-based scheduling tools often outperforms either humans or computers acting by themselves (Bergeron, 1981; Dunkler et al., 1988; Haider, Moodie and Buck, 1981; Sorkin and Woods, 1985).
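A minimal sketch of this division of labour is given below in Python. The human-supplied filter is a stand-in for the kind of contextual judgement described above; the jobs, machines and rule in it are invented for illustration.

```python
# Sketch of the division of labour argued for above: the human prunes the
# candidate set using contextual judgement, and the computer does the routine
# enumeration over what remains. All names and rules here are illustrative.
from itertools import product

jobs = ["J1", "J2", "J3"]
machines = ["M1", "M2", "M3"]

def human_filter(job, machine):
    """Stands in for the scheduler's contextual knowledge, e.g. 'keep J2 off M3
    because its operator is away this shift'. Such rules are rarely in a database."""
    return not (job == "J2" and machine == "M3")

candidates = [(j, m) for j, m in product(jobs, machines) if human_filter(j, m)]
print(len(candidates), "assignments left for the computer to evaluate")  # 8 of 9
```

The computer still performs the routine enumeration, but over a space the human has already narrowed using knowledge that was never articulated a priori.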

3.9 Summary

In this chapter, the features of scheduling that are pertinent to the investigation of 'hybrid' automation were discussed. By regarding scheduling activity as the management of constraints, the limitations of various approaches to scheduling were argued. Jobs were defined in terms of a list of attributes that constrain form (i.e., materials and geometry) and time (due date). Machines were seen to be constrained in their capabilities and availability. Operation-machine entity relationships, constraining manufacturing time (i.e., time reservations placed on machines), were shown to form when jobs are allocated to machines. Schedulers were seen to commonly encounter situations in which they cannot find any schedule that meets all the constraints. To make such problems manageable they are forced to relax some constraints. While some constraints were seen to be inviolable (e.g., technological constraints), others were shown to be only preferential.

The argument presented asserts that the classical OR approach is the heavy relaxation of constraints. This results in many feasible schedules, and the problem then becomes one of finding the most suitable, judged against some performance measure. As problems extend beyond a few jobs, they become too complex to solve exactly, and heuristic methods are generally applied to find an appropriate schedule. Heavy relaxation was shown to have an extreme drawback: the transformed problem is often far removed from the actual shopfloor problem. The alternative, light relaxation of constraints, was then considered. Feasible schedules are then few, and the issue becomes which constraints should be relaxed to produce an appropriate schedule. Complexity under these circumstances is due to the number and variety of the constraints. While automated constraint-based scheduling systems probably could manage these constraints, they require all knowledge a priori.

Discussion then led to defining two dimensions of problem difficulty: complexity and perplexity. While complexity is associated with problem size, perplexity pertains to the involved, intricate, or complicated conditions of circumstances. It was then argued that complexity and perplexity make it extremely difficult to produce computer-based methods for scheduling. However, human schedulers manage to produce workable schedules under such adverse conditions. By applying deep knowledge, they go beyond the shortcomings in the directly available information. They can identify a system's state when confronted with incomplete and ill-defined data. They use their experience to judge the adequacy of the information supplied. To complement inadequate information, they seek extra information from other sources or from their experience. Using intuition, they fill in the blanks about what is happening, what can happen, and what will happen on the floor, drawing on sensory data and a mental model of the situation. They pursue goals without fully articulating them, by following practices that years of experience within the industry lead them to believe are good. Through their deep understanding of the domain, they may know when and how machines may be used in exceptional ways. Consequently, they may change constraints that otherwise would be inviolable. Scheduling by humans, indeed, avoids the need for all knowledge to be explicitly defined a priori.

From the above argumentation, it is clear that a hybrid intelligent production scheduling system (HIPSS) should bring together the strengths of traditional operations research, knowledge-based systems and human approaches to scheduling, while avoiding their weaknesses. It must accommodate the complexity and perplexity found in real manufacturing establishments. Before a system that engenders active engagement by humans in scheduling decisions can be designed, it is necessary to discuss how humans solve scheduling problems. Therefore, the next chapter focuses on how humans and computers interact in decision making.

Chapter 4 Human-Computer Interaction in Production Scheduling

How humans and computers can solve problems in concert is the subject of this chapter. In Chapter 2, a supervisory control model was used to describe the process of controlling production. The operations that were ascribed to supervisory controllers are:
1. Goal-setting and planning;
2. Implementing a plan to reach a goal;
3. Monitoring the process;
4. Intervening when needed;
5. Learning from the results of previous actions.

Sheridan's multi-loop model of supervisory control (Figure 21) was used as a schema to describe the features of a system that combines human and machine intelligence. Hybrid intelligence appears in the activities associated with loops 8, 9 and 10. To share decision making with the computer, Sheridan's human operator has to be able to interact with it. Symmetry exists in their partnership (Figure 22). The interactive process between the human supervisor and the Human Interactive System (HIS) is explored in this chapter. In a Hybrid Interactive System the computer helps humans in their decision making, in a way that is consistent with human cognitive processing. The human may operate with quite different constructs from the automated algorithms, but be required to monitor and/or take over from them under conditions of uncertainty and rapid change.

Figure 21. Multi-loop model of supervisory control (Sheridan, 1987). [The figure shows the human operator interacting through displays and controls with a Human Interactive Subsystem (HIS) computer, which communicates with a semi-automatic Task Interactive Subsystem (TIS) computer connected through sensors and actuators to the task.]

Figure 22. Human interactive subsystem (HIS). [The figure shows the human supervisor and the HIS computer linked symmetrically through presentation and controls.]

To construct an effective hybrid-intelligent production scheduling system (HIPSS), one must first understand how the human operator solves scheduling problems (Nakamura and Salvendy, 1987). Therefore, this chapter considers how human schedulers approach decision making. Scheduling is placed within a systems-thinking context, in which schedulers are perceived to make decisions through rational action. The manufacturing system is shown to be a product of purposeful action, in which each individual machine has a specialised purpose and the combination of machines within the production shop has been purposefully chosen. Using means-ends relationships to model the manufacturing system, decision making in scheduling is shown to consist of recognition and action cycles that are linked within a "decision ladder." Reasoning is classified into skill-, rule- and knowledge-based behaviour, each related to particular paths in the decision ladder. A specific form of the ladder, called the Model Human Scheduler, that represents decision activity in scheduling is then described. Schedulers' perception of the system is shown to vary across a means-ends hierarchy and to depend on the type of reasoning they invoke. This leads to a discussion of the development of an HIPSS in which activities are shared between human and computer in the decision ladder. Finally, aspects of a computer interface that support schedulers in perceiving the system at different levels of a means-ends hierarchy are discussed in relation to surface features in the display. Subsequent chapters consider the types of scheduling strategies that have to be accommodated (Chapter 5) and the form of an HIPSS (Chapter 7).

4.1 Humans and the System

Ackoff (1979a) asserts that since its inception during World War II, reductionism has dominated OR thinking. In classical OR a machine analogy applies to the organisation of a factory. The organisation decomposes into basic, indivisible component subtasks (Kerr, 1991). Each subtask is then analysed independently and the most efficient way of performing it is scientifically determined. In other words, the whole is taken apart to find out how each part works, and the understanding of the parts is then assembled into an understanding of the whole. Research focuses on methods for performing each subtask efficiently. Kerr calls for a paradigm shift to systems thinking to avoid the loss of essential holistic properties. A systems thinker sees each "object" as a part of one or more larger wholes, instead of seeing it as a whole that has to be taken apart. While a reductionist sees a factory as the sum of its parts, a systems thinker's gaze is outwards. The factory sits within a larger system, for example the distribution system of the market. The systems thinker seeks to understand a system's role or function within a larger containing system.

Each organisation is a purposeful system containing purposeful subsystems that have different roles or functions. Under systems thinking, the coordination and integration of different organisational activities are central (Ackoff and Emery, 1972; Beer, 1966, 1979; Churchman, 1971). Organisations, and their parts, have purposes of their own. They are also part of larger purposeful systems. A system may be an amalgam of persons and technology, each with their own goals. Overall system goals may then be hard to define. Goals that diverge between persons lead to a divergence in purpose and may therefore produce 'cognitive dissonance' (Kerr, 1991).

The interconnection of subsystems can be simplified through clustering or decoupling. Hierarchical organisations tend to cluster interactions within and between their subsystems. Departmental managers may insist that all communications within their department and between departments pass through them. To decouple subsystems, buffers may be introduced, or lead times may include sufficient slack so that time variance is not passed on (see the changeover of batches in Chapter 2). However, in small-batch manufacture, which is the focus of this study, social interaction tends to be high. The need for frequent human intervention can bring operators into extensive contact with a wide range of co-workers at all levels (Blumberg and Alber, 1982). Schedulers in practice cannot isolate themselves from other activities within the organisation. They have to cope with complex systems of changing problems that interact with each other. They manage messes, rather than solve problems (Ackoff, 1979a, 1979b). For a purposeful system in a dynamic environment, it is therefore pointless to seek optimal schedules. Scheduling from the systems perspective becomes a process of synthesis more than analysis.

4.1.1 Actions

The actions of schedulers operating within perplex organisations cannot be mechanistic. The epistemological position is one in which schedulers are purposeful actors. In the world of manufacturing, seemingly dominated by rational purpose, Popper's (1972, 1976) rational action model of human behaviour is appealing. From Popper's perspective the working environment (the situation) is itself the source of rationality. It is taken as a social fact independent of any actual human agent. Hence, Popper's stance is positivist. Boothroyd (1978) provides a framework for analysing behaviour from an action model perspective. An action programme for articulated intervention has three types of elements:
1. Theories: passive and descriptive beliefs of how things are or might be. They imply what actions are to be considered, and what consequences are to be anticipated if an action is taken.
2. Proposals: active and prescriptive statements of what ought to be, and which hence promote action. Through proposals, values and ethics enter the action programme. Proposals state how actions are to be evaluated and so lead to the action that is to be taken.
3. Actions: these result in the modification, deletion or addition of proposals and theories.

Gault (1984) uses Boothroyd's action programme to appraise the classical (Technical) OR scheduling perspective and to put forward an alternative, the Social OR action programme. In a Technical OR action programme, the core theory is positivist: actions are based on prediction. The core proposal is the maximisation of a criterion function. The core actions emphasise objectivity and are quantifiable. However, human actors, free to attribute meaning to what they perceive, organise manufacturing activity (Checkland, 1981). In Chapter 3, the meaning attributable to due date was shown to depend upon the actor's perspective.

Early in the twentieth century, Weber introduced a means-end scheme of rationality to explain social activity (Brubaker, 1984; Weber, 1962). This action approach gives primacy to the individual actors who pursue their own ends and in so doing create social reality as a process. The nature and content of the individuals' thinking about the world are central, in contrast to the world of positivism, which is independent of all observers (Husserl, 1936). It is compatible with the critical theory of the Frankfurt School. Its leading postwar theoretician, Habermas (1971), distinguishes between purposive-rational action and interaction. Purposive-rational action is either instrumental action or rational choice, or their conjunction. Technical rules based on empirical knowledge govern instrumental action: actors make conditional predictions about observable events, which may be either physical or social. Strategies based on analytic knowledge govern rational choice: actors make deductions from preference rules (value systems) and decision procedures. Therefore, under the given conditions, defined goals are realised through purposive-rational action. Instrumental action organises the means, which are either appropriate or inappropriate, according to criteria that define an effective control of reality. Habermas sees purposive-rational action, founded on positivist natural science, governing human discourse through technical rules based on empirical knowledge. However, social norms also govern human discourse. In this form of interaction, communicative action depends on shared symbolic meanings. The discourse is bound by consensual norms that define reciprocal expectations about behaviour. Interaction therefore relies on a mutual understanding of intentions.

Schedulers are subjected to the social dynamics operating within the organisation. The system of people has rich and assorted values and goals, both intrinsic and extrinsic. Persons within the system have multiple relationships and roles. Their number, their relationships and their individual goals are even subject to change (Gault, 1984). Scheduling is therefore a social activity. The core of Gault's theory of Social OR action is the premise that organisations are inhabited by purposeful beings. The focus of interest is on the actions of people (actors). The core proposal is the provision of the means to assist actors such that the quality of their actions improves. The goal is quality, not efficiency, as the complex array of consequences associated with any action includes many that are subjectively experienced and not amenable to quantification. While the elements of the action programmes are inherently dynamic, some core proposals and theories are difficult to change. At any instant, some theories and proposals will be explicit, but the majority are latent (the actor, while not immediately conscious of these, can summon them as required). There will also be some implicit (hidden, subconscious) theories and proposals.

4.1.2 Modelling Purposeful Action

Schedulers are usually practical persons who have a keen understanding of their resources: the capabilities of machines and work practices. Their competence comes from a deep understanding of the work domain, evolved through experience gained under a variety of circumstances (Brödner, 1990; Dutton and Starbuck, 1971). Like other shop-floor artificers, they acquire much of their knowledge by solving pressing problems, often under temporal stress (De Montmollin and De Keyser, 1986). They learn to deal with a vast array of factors that arise in the working environment. These may be unpredictable and not easily placed into a theoretical context. Through their actions, they build up their knowledge of the domain: the knowledge is built by, and for, the action.18

18 In the study by Dutton and Starbuck (1971), Charlie found the numerous rules he applied in estimating run time by meticulously examining the configuration of the manufacturing system for different jobs. He memorised the associations between production speed and the scheduling characteristics. He then used this mental look-up table to plan his scheduling actions.

Having decided upon a goal to achieve, schedulers can be understood to form intentions, make plans and carry out actions (Norman, 1986). Their intentions derive from their value structures and internal goals (Rasmussen, 1985). Seemingly, a scheduler's behaviour is volitional, yet external constraints place boundaries on decision choice. For example, a specific operation can only be performed on particular machines. Purposive human behaviour must be based on an internal representation of the system constraints (Rasmussen, 1983). Often a decision-maker uses the internal representation without attention to its referents. Rasmussen found that even for processes that can be readily analysed, process-control operators tend to explain a system's behaviour in terms of a presumed functional structure, instead of collecting information on the actual physical structure: that is, instead of checking the actual system, they rely on mental simulation.

To know how to shape a scheduling system in which humans actively partake in the making of decisions, we need to know how humans perceive the scheduling process. How do they interpret signals from the environment and work out appropriate actions (Green, 1990)? What mechanisms do they apply in generating descriptions of system purpose and form, explanations of system functioning and observed system states, and predictions of future system states (Rouse and Morris, 1986)? Obtaining answers to these questions depends on a formal language that describes human decision-making processes in supervisory control of manufacturing systems. A formalised description requires a systems-oriented method of analysis that encompasses both the engineering system and the problem-solving operations of the human decision-maker.

4.2 Cognitive Work Analysis

Cognitive Work Analysis (CWA) is a systems-based approach to the analysis, design, and evaluation of human-computer interactive systems. Sanderson (1998) states that it is a form of analysis that is neither normative (actively dictating how activity should proceed) nor descriptive (passively describing existing activity). Instead, CWA is "formative," as it provides a means for designing an HIPSS that leads the human scheduler to the most effective behaviour. The aim is to develop a system that supports human activity in situations previously not encountered (Sanderson, 1998).

The principal components of CWA are work domain analysis (WDA) and activity analysis (AA). WDA is event-independent and is quite separate from a subsequent event-dependent analysis of the activity that takes place within a work domain. WDA represents the "ecology" in which activity can take place (Sanderson, 1998): it describes the operational constraints of the physical system. For a nuclear power plant (NPP), the laws of thermodynamics, inter alia, act as constraints on the behaviour of the system. AA describes the activity itself, which must lie within the bounds of the physical system; it is therefore event-dependent. Rasmussen et al. (1994) describe activity in work domain terms, in decision terms, and in terms of mental strategies.

The focus in CWA is on psychological products rather than mental processes, which are the subject of psychological models (Sanderson and Harwood, 1988). It also incorporates and extends many important features of current cognitive theories (Reason, 1988). This shift away from psychological models of human behaviour to a conceptual framework is critical. It provides a classificatory framework and a language for integrating research from diverse areas (e.g., display research, artificial intelligence, cognitive science, etc.), each of which applies its own methods (Moray, 1988; Goodstein, 1981).

CWA has been applied to a variety of systems, such as continuous process control, emergency management, office automation, and library retrieval (Rasmussen, 1986). Rasmussen has placed such systems on a continuum that ranges from systems strongly constrained by the laws of physics (such as power generation and chemical process control) to systems strongly determined by human intentions (such as libraries and military command and control systems). The formation of schedules is clearly of the latter class, as it is determined by human intentions. The factors that affect the behaviour of an individual human scheduler within a system range from the physical and engineering constraints of the work domain itself to the cognitive, perceptual, and affective properties of the individual actor (Sanderson, 1998).

4.2.1 Work Domain Analysis

Being products of human endeavour, manufacturing systems are purpose directed. Rasmussen (1983) asserts that teleological explanations of the functions of engineered systems, which are derived from their ultimate purpose, are as important as causal explanations based on engineering analysis. To control a manufacturing process, supervisory controllers may consider its functional properties. They can then act to control these properties to meet an ultimate purpose. Their actions may be in direct response to the system's state (causality), or they may use their experience to reason from abstraction (intentionality). Rasmussen uses a means-ends hierarchy to depict the physical system at different levels of abstraction (Figure 23).

WDA makes use of an abstraction hierarchy (AH) to describe a system in languages that distinguish its purposive and physical aspects. The descriptive language varies with the level of abstraction. Together the descriptions elucidate why the system exists, the priorities and values embedded in it, its functions, and its physical make-up (Sanderson, 1998). The lowest level of abstraction represents only the physical configuration of objects and their locations. The next highest level represents the physical processes or functions of the various components and systems in a language related to their specific electrical, chemical, or mechanical properties. Above this, the functional properties are represented using more general concepts, without reference to the physical process or equipment by which the functions are implemented. The levels of abstraction describe the means-ends relations. The formation of the abstraction hierarchy is one part of WDA. The other dimension of WDA is decomposition, which depicts part-whole relations. Elements of a system represented within a level of abstraction often functionally decompose into smaller parts.

Figure 23. Means-Ends Abstraction Hierarchy: Functional properties of a physical system designed to serve human purposes, described at several levels of abstraction (Rasmussen and Pejtersen, 1995). [The figure shows the levels, from top to bottom: purposes and values, and constraints posed by the environment; priority measures and the flow of mass, energy, information, people and monetary value; general work activities and functions; specific work processes and physical processes of equipment; and the appearance, location and configuration of material objects. Purpose-based properties and reasons for proper function propagate top-down as intentional constraints; physics-based properties and causes of malfunction propagate bottom-up as causal constraints.]

As the method has matured, the names for the levels have changed to those shown in Figure 24 for a heat exchanger.19 The nodes at each level represent the relevant purposes, priorities, functions, and objects. For a node at a given level, the parent nodes tell "why" the function exists, whereas the child nodes describe "how" the function has been engineered (Sanderson, 1998). The top three levels show how general purpose-related functions are chosen, given the overall functional purpose of the system and its priorities and values. The lower two levels show the system as an array of physical objects that have their own functions.

19 The original names, from top to bottom, were Functional Purpose, Abstract Function, Generalised Function, Physical Function and Physical Form. 'Priority/values' and 'purpose-related function' more clearly express the meaning that Rasmussen attributes to abstract and generalised functions. 'Physical form' inadequately conveys the full import of a physical device or physical object, as it tends to delimit the meaning to spatial characteristics.

Figure 24. State variables associated with a heat exchanger at different levels of abstraction. [The figure arranges the nodes in five levels: functional purpose (efficient energy transfer); priority/values (principles of heat transfer); purpose-related function (heat transfer in the primary and secondary circuits); physical function (temperatures in and out, and flow rates, for each circuit); and physical device (thermocouples, flow meters, and the primary and secondary circuits).]
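To suggest how such a hierarchy might be held in software, the Python sketch below encodes a fragment of the heat-exchanger example as nodes with "why" (parent) and "how" (child) links. The representation is an illustrative assumption, not a prescription from the CWA literature.

```python
# Sketch of how the means-ends links of Figure 24 might be represented: each node
# records its level, its parents ('why' it exists) and its children ('how' it is
# realised). Only a fragment of the heat-exchanger hierarchy is shown.

nodes = {
    "efficient energy transfer": {"level": "functional purpose",
                                  "how": ["principles of heat transfer"]},
    "principles of heat transfer": {"level": "priority/values",
                                    "why": ["efficient energy transfer"],
                                    "how": ["heat transfer primary", "heat transfer secondary"]},
    "heat transfer primary": {"level": "purpose-related function",
                              "why": ["principles of heat transfer"],
                              "how": ["temperature 1 in", "temperature 1 out", "flow rate 1"]},
    "temperature 1 in": {"level": "physical function",
                         "why": ["heat transfer primary"],
                         "how": ["thermocouple 1 in"]},
    "thermocouple 1 in": {"level": "physical device", "why": ["temperature 1 in"]},
}

def why(node):
    """Answer 'why does this element exist?' by following parent links upward."""
    return nodes[node].get("why", [])

def how(node):
    """Answer 'how is this function realised?' by following child links downward."""
    return nodes[node].get("how", [])

print(why("temperature 1 in"))      # ['heat transfer primary']
print(how("heat transfer primary"))  # the state variables that realise the function
```

Traversing the "why" links answers purpose-directed questions, while traversing the "how" links answers questions about implementation, mirroring the means-ends reading of the hierarchy described above.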

The links in the WDA for the heat exchanger show how to instantiate the design from functional purpose to physical devices. The functional purpose of designing a heat exchanger is the efficient transfer of heat from one fluid to another. The level of priorities or values states the intentional constraints: through the application of the physical principles of heat transfer, the designer places constraints on the behaviour of the system. At the level of purpose-related function, heat is transferred in the primary and secondary circuits. The lower levels show the physical functions and the devices. At the physical level there are tubes, fluids and sensors, and the sensors measure the appropriate state variables. The nodes and links are clear and unambiguous.

4.2.2 Activity Analysis

Activities can be analysed in both work domain and decision terms. They can be displayed in work domain terms by superimposing each activity on the abstraction hierarchy (AH) of the WDA. To analyse activity in decision terms, Rasmussen (1983) classified reasoning behaviour in supervisory control into three distinct types: skill-based, rule-based, and knowledge-based reasoning.

When a person recognises features relevant to his or her immediate needs and goals, his or her behaviour may be skill-based (Rasmussen, 1990). It rolls along without the person's conscious attention. Rule-based behaviour consists of a sequence of subroutines in a familiar work situation, which the person consciously controls (Rasmussen, 1990). It is directed towards the meeting of goals. Very often, the goal is not even explicitly formulated, but is found implicitly in the situation that releases the stored rules. The boundary between skill-based and rule-based performance is not well defined, as it depends on the level of training and on the person's focus of attention. Their difference can be shown using an example in which the two types of behaviour interfere: an experienced American driver hires a car on arrival in Britain and, being unaccustomed to driving on the left, has to suppress inappropriate skill-based behaviour and operate at the level of rule-based behaviour.20

Skill-based and rule-based behaviour are purposive-rational action of instrumental form, whereas Habermas refers to knowledge-based behaviour as rational choice. Knowledge-based reasoning is used in unfamiliar situations for which no rules for control are available. A mental model, the composition of which is discussed later in this chapter, represents the internal structure of the system. Operating under this mode of reasoning, a person tests different plans against explicitly formulated goals, which derive from analysis of the environment and the overall aims of the person. Testing may be either through physical trial and error, or conceptual. For the latter, the person uses their perception of the functional properties of the environment to predict the effects of the considered plan. Table 2 summarises the representations used in the different types of reasoning and the processes a person undertakes when reasoning.

20 Making a right-hand turn in the UK (equivalent to a left turn in the USA) and changing lanes on a freeway are especially difficult.


Table 2. Summary of the basis for cognitive control: schematic illustration of the representations of the regularities behind the behaviour of the environment that are used for control of behaviour (Rasmussen, 1990).

Knowledge-based behaviour
  Representation of the problem: mental models, explicit representation of relational structures; part-whole, means-end, causal, generic, episodic, etc., relations.
  Process rules: heuristics and rules for model creation and transformation; mapping between abstraction levels; heuristics for thought experiments.

Rule-based behaviour
  Representation of the problem: implicit, in terms of cue-action mapping; black-box action-response models.
  Process rules: situation-related rules for operation on the task environment, i.e., on its physical or symbolic objects.

Skill-based behaviour
  Representation of the problem: internal dynamic world model representing the behaviour of the environment and the body in real time.
  Process rules: not relevant; an active simulation model is controlled by laws of nature, not by rule.

There is a correlation between reasoning behaviour and the form of information processing. In skill-based reasoning, the sensed information is perceived as signals from the environment. At the rule-based level, sensed information is perceived as signs that activate stored (stereotypical) patterns of behaviour.21 That is, cue-patterns in the environment trigger a stored set of activities. At the knowledge-based level, symbols apply.22

Rasmussen's (1990) SRK (Skills, Rules and Knowledge) framework for decision-making consists of cyclic activities of recognition and action. Rasmussen (1976) developed a schematic map of the sequence of mental activities of human operators, in what has come to be known as Rasmussen's Decision Ladder, shown in Figure 25. On detecting a need for data processing, the human responds with a series of recognition-action cycles to analyse the situation in relation to goals: a traversal up the left side of the ladder.

21. From a behavioural perspective, a sign is a preparatory-stimulus that causes a certain response sequence in a person (Morris, 1946).

22. Symbols are abstract constructs related to and defined by a formal structure of relations and processes, which through conventions can be related to features of the external world. That is, they represent other information, variables, relations and properties, and can be formally processed (Rasmussen, 1978b; 1990).


Having analysed the situation, the human uses a series of recognition-action cycles to plan appropriate actions on the system. Experienced supervisors would not have to pass through each step of the process, but would shortcut from the left to the right sides of the ladder. These shortcuts represent rule-based behaviour. Decision-making processes can be analysed using the SRK framework and the decision ladder. Tools such as Card, Moran and Newell’s (1983) GOMS (Goals, Operators, Methods and Selection rules) can be used to predict the mental workload for critical activities (Moray et al., 1988). Decision aids to support some recognition-action cycles can then be designed (Sheridan, 1987; Paradies, 1985). From the SRK framework, designers can consider when and where in a display the information acts as a signal, sign or symbol. The focus is on the operator’s mental context, as signals, signs and symbols are products of mental context. While the surface features may not change, their interpretation changes with mental context. These ideas will be shown to be crucial in designing an HIPSS.


[Figure 25 (not reproduced): the decision ladder, annotated with knowledge-based analysis on one side, knowledge-based planning on the other, and rule-based short-cuts linking the two sides.]

Figure 25. Schematic map of the sequence of information processes involved in a control decision (Rasmussen and Goodstein, 1986).23

Lind (1991) states that most people would intuitively understand the meaning of Rasmussen’s decision ladder because it has so many possible interpretations: yet, it is difficult to distinguish between what is actually represented and what is inferred or implied by a person looking at it. He considers it an insufficient representation when used beyond its original purpose of describing supervisory control of continuous industrial processes.24 For example, when there are

competing hypotheses, interpretation is ambiguous. The Ladder also presumes that tasks are solved. However, if they are not solved more information on which to make a decision may need to be collected. This would involve other tasks that are not in the direct sequence. Lind also demonstrates that the model may sometimes not show all the states of knowledge that affect data-processing activities. For example, the DEFINE TASK sub-task may require both the system state and the target state as input information (Figure 25). Where the decision-maker’s choice of information-processing strategy may depend upon current knowledge, each decision task would have to be considered as a task category (or class) that has different instantiations, corresponding to different information processes. Lind therefore warns that care has to be taken in using it as a formalised model for explicitly representing knowledge and for controlling the problem solving in knowledge-based systems. Although Lind is critical of some aspects of Rasmussen’s Ladder, he accepts its basic structure and he was party to its development.25 Lind (1988) argues that the means-ends abstraction hierarchy brings a necessary formalism to the systematic design of information systems for operator support. Displays can then represent system relations (means-ends), instead of the customary system-elements (parts-whole) view.26

23. The early form of this map (Rasmussen, 1976) does not include the arrows showing the direction of knowledge-based analysis and planning and the labelling of rule-based shortcuts.

24. While Rasmussen and Jensen’s (1974) study of electronic troubleshooting does not mention either the Abstraction Hierarchy or the decision ladder, in this study are the foundations on which Rasmussen formed these concepts for studying mental procedures in real life tasks. Nevertheless, it is evident from Rasmussen (1974) that the modelling of human operators in process control was of primary interest.

25. He worked with Rasmussen at Risø from 1970 to 1985.

26. The part-whole dimension delimits the section of the problem environment that is within the span of attention (Rasmussen, 1990). In the example on electric power generation, that is discussed later in the chapter, the physical parts (nuclear reactor, heat exchanger, steam generator, turbine, etc.) are apparent. The means-end dimension specifies the level of generality at which the problem is considered. The inlet and outlet temperatures of the heat exchanger are specific measures, whereas mathematical formulae describing the heat transfer are general and abstract.

4.2.3 Human Supervisory Control in Continuous Process Control

The process control domain is highly automated. When operating normally, the industrial process is self-regulating. The function of the human is to supervise the process as shown in Figure 21 (Sheridan, 1987). As supervisory controllers, humans are then outside the primary control loop. They set the initial conditions, make adjustments intermittently, and receive information about the process. They monitor the process for abnormal behaviour. Upon observing unsatisfactory performance, they troubleshoot the problem and then intervene to bring the system back to the desired state. As long as the system operates in fully automated mode and meets its functional purpose, it operates without human intervention. In such cases, it is straightforward to represent the system using the WDA and the decision activities of the humans supervising the system using AA. Rasmussen’s (1986) ‘decision ladder’ can readily represent the monitoring, troubleshooting and intervening activities of the supervisory controller. Activities represented by the decision ladder are directed towards known goals. The decision ladder in Figure 26 shows the activities for troubleshooting problems that arise with normal operation of the heat exchanger. In the example, a low output from thermocouple 2 (see left side of Figure 26) can trigger the task “check the thermocouple electrical circuit” (see right side of Figure 26). This requires planning the procedure and then executing it. However, if the problem has arisen before, the troubleshooter can sidestep the planning phase, as the appropriate procedure is already known, and move directly to the procedure. The decision to check the electrical circuit depends upon previous experience of this type of problem. If the supervisory controller had not come across this situation before (or if on checking the electrical circuit it had been found to be functioning normally), he/she would return to the left side of the ladder and move upwards to identify more fully the state of the system. That is, decision-making moves to observation at a higher level of abstraction: the flow of energy in the primary and secondary circuits. A well-designed computer interface, therefore, must depict the different



levels of abstraction shown in the WDA in a way that supports all the activities shown in the decision ladder. The WDA of an automated system taken from a designer’s perspective may be quite different from the operator’s or user’s perspective. This can be seen when the operator uses a servomechanism, which is also an automated process (Higgins, 1998). This will be discussed in some detail below using, as an example of a servomechanism, the power steering mechanism of a car (see Figure 27).

Decision making by supervisory controllers goes beyond monitoring and troubleshooting. As discussed in Chapter 2, they also:
1. Set goals and plan future activities;
2. Implement plans;
3. Learn from the results of previous actions.
Setting goals and planning are anticipatory activities that precede implementation and therefore do not have to be responsive to immediate control requirements. As these activities are different from monitoring and troubleshooting, the decision ladders would differ. There is no definitive action sequence in the decision ladder for formulating goals. When setting goals and formulating plans, the supervisory controller is not responding directly to alerts generated by the dynamics of the self-regulating process. How such activities should be represented is the subject of an AA of scheduling.


[Figure 26 (not reproduced): a decision ladder for the heat exchanger. On the left side, an alert activates the observation of data/evidence (e.g., thermocouple 2 has an unusually low reading) and the identification of the system state; options are evaluated against the goal of maintaining heat transfer, comparing the actual energy in and energy out with the desired state of energy in = energy out. On the right side, a task is chosen (e.g., check the thermocouple electrical circuit), the procedure is planned (e.g., recall the procedure for checking the thermocouple electrical circuit) and then executed.]

Figure 26. The decision-making activity of the supervisory controller in monitoring the heat exchanger.
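The rule-based shortcut in this troubleshooting example can be sketched in a few lines of Python. The sketch is illustrative only: the symptom label, the stored procedure and the escalation step are assumptions made for the example, not part of Rasmussen's notation.

```python
# Symptom-to-procedure mappings learnt from past troubleshooting episodes
# (rule-based shortcuts from "identify state" straight to a known procedure).
known_procedures = {
    "thermocouple_2_low": "check the thermocouple electrical circuit",
}

def respond_to_alert(symptom, known_procedure_failed=False):
    """Sketch of the supervisory controller's response to an alert."""
    procedure = known_procedures.get(symptom)
    if procedure is None or known_procedure_failed:
        # No stored rule, or the familiar check did not explain the symptom:
        # return to the left side of the ladder and analyse the system at a
        # higher level of abstraction (knowledge-based behaviour).
        return "analyse the flow of energy in the primary and secondary circuits"
    # Familiar situation: sidestep the planning phase and execute the procedure.
    return f"execute: {procedure}"

print(respond_to_alert("thermocouple_2_low"))    # rule-based shortcut
print(respond_to_alert("unfamiliar_symptom"))    # knowledge-based analysis
```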

In the process industry, the WDA that describes the system is founded on the physical laws constraining the system: for the heat exchanger, they are the principles of fluid flow and heat transfer. The WDA describes the affordance structure for the activities portrayed by the AA. As the function of the supervisory controllers is to monitor the process and to intervene when its behaviour is aberrant, the appropriate WDA must describe the physical operation of each component in the system.

[Figure 27 (not reproduced): a feedback loop in which the steering-wheel angle is the input to a summing junction, the error e(t) between it and the fed-back front-wheel angle (via the steering geometry) drives the steering booster, and the booster output turns the front wheels.]

Figure 27. Schematic representation of the operation of power steering in a car.

Returning to the driving example, the driver uses the steering wheel to input the desired angle of the wheels. The servomechanism turns the wheels until the desired angle is reached. By setting the value of the input — the angle that the


steering wheel is turned — the driver supervises the direction that the car goes. The goal of the designer of the mechanism is to make the servomechanism behave as a linear device within the particular time constraints under which the device would be used. However, the goal in using the steering mechanism, when considered as part of the broader system of the car and driver, is something else. As the car travels along the road, the driver’s goal may be to keep the car travelling within a particular lane. While the mechanism is operating correctly, all the user needs to know is that it affords steering. Cognitive work analysis of the driver does not have to consider the same WDA of the steering mechanism that a designer or repairer would navigate through. The WDA for the driver maps the affordances at different levels of abstraction. Its functional purpose is steering. Its physical function is to amplify the torque of the driver while maintaining a linear relationship between the angle that the steering wheel is turned and the angle that the front wheels of the car move. The form of the WDA thus depends on the activity analysis of the task of driving. The above discussion shows that there can be different WDAs for the one physical system. Explanation in terms of the grain of analysis may seem enticing; however, this ignores the fact that a WDA is constructed relative to the user. It describes the system in terms of affordances for the activities of the user. Hence, WDA depends upon the AA that it supports. Recognition of relevant affordances depends upon the users being attuned to constraints that support their activities.27 However, the type of situation determines the constraints to which the user is attuned (Edlund, Weise and Lewis, 1995). This will become evident in the following discussion on production scheduling.
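A toy simulation of the loop in Figure 27 makes the designer's view of the servomechanism concrete. The gain, time step and number of iterations below are arbitrary illustrative values, not parameters of any actual power-steering booster.

```python
def steer(desired_angle, wheel_angle=0.0, gain=2.0, dt=0.05, steps=60):
    """Crude proportional servo: the booster output is proportional to the
    error e(t) between the steering-wheel input and the front-wheel angle."""
    for _ in range(steps):
        error = desired_angle - wheel_angle   # e(t) at the summing junction
        wheel_angle += gain * error * dt      # booster turns the front wheels
    return wheel_angle

# The driver only supervises the input; the loop itself does the regulation.
print(round(steer(desired_angle=15.0), 2))    # converges towards 15 degrees
```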

27. Benda and Sanderson (1998) and Rasmussen (1998a) have explored the idea of different stakeholder domains or object worlds. WDA is constructed relative to the purpose of analysis.


4.2.4 Model Human Scheduler

Sanderson and Moray (1990) wanted to develop a model of human scheduling behaviour — the Model Human Scheduler (MHS) — to gain an understanding of the subjective parameters that influence a human’s ability in undertaking scheduling routines. By focussing on psychological questions (e.g., how do humans service tasks under time pressure?) they aimed to predict how well “unaided” human schedulers would perform in a particular plant. A successful MHS would then serve as a basis for FMS system design, for designing on-line decision support systems, and for developing FMS operator training guidelines. They rejected the GOMS approach to modelling scheduling behaviour as:
1. It has limited usefulness when there is not one best way to perform a task, which is the case for scheduling;
2. It is not useful where behaviour is error prone, as errors are unexplained deviations from a path towards a goal;
3. It cannot handle interruptions to the user.
To provide a control structure of recognition and action cycles Sanderson (1991) reworked Rasmussen’s decision framework to place it within a scheduling context (see Figure 28). To express the human’s behaviour she used the formal notation of

production system architecture as used in AI (Newell and Simon, 1972; Neches, Langley, and Klahr, 1987; Kieras and Polson, 1983, 1985).28 IF-THEN production rules represent the actions shown as arcs that connect the data-processing activities and states of knowledge. The production rules do not

28. The meaning of “production” as used here is not to be confused with the same term used in the manufacture of goods. See Eberts (1993) for a discussion of the development of the production system model for HCI. Bovair, Kieras and Polson (1990) refined the model by representing the production system using the PPS (Parsimonious Production System) Rule Notation that was developed earlier by Kieras. In 1988, Newman used the antecedent-consequent framework of production rules to describe heuristics representing the experience and intuition used by an expert scheduler to generate and revise schedules.


necessarily reflect the way humans internally solve scheduling problems. They merely provide a means for analysing the type of information that humans must engage. By designing the MHS as a production system described by Neches, Langley, and Klahr (1987), Sanderson separated the production memory, containing all the IF-THEN rules in the production system, from the working memory, which holds the current goals and currently apprehended data. A recognise-act cycle then consists of three processes. A match process finds the best match between the current contents of working memory and the individual productions in production memory. A conflict-resolution process determines which production rule to apply. An action process carries out the chosen production rule and thereby changes the contents of the working memory.

[Figure 28 (not reproduced): the Model Human Scheduler, a decision ladder whose nodes are information-processing activities (Activate, Observe information and data, Identify state, Determine state consequences, Evaluate performance criteria, Choose criterion, Define policy: operationalise criterion, Determine steps: operationalise policy, Carry out) and the states of knowledge they produce (Alert, Data, State, State consequences, Performance goal, Criterion, Policy, Steps). The ladder is partitioned into skill-based, rule-based and knowledge-based domains, with knowledge-based analysis on the left and knowledge-based planning on the right, and numbered production rules (1–17, including variants such as 7a–7c and 8a–8d) on the connecting arcs.]

Figure 28. Model Human Scheduler (Sanderson, 1991).

The information-processing activities shown in Figure 28 represent the temporary goals held in working memory and the states of knowledge represent the apprehended data. Production rules can be divided into those dealing with recognition, decision, and action. Movement from information-processing activities to states of knowledge on the left-hand side is recognition, or


knowledge-based analysis. Movement from states of knowledge on the left-hand side and top to information-processing behaviour, or to further states of knowledge, is decision, or rule-based behaviour (except for the bottom-most arc, which represents skill-based behaviour). Movement down the right-hand side is knowledge-based or rule-based planning activity. Sanderson uses Card, Moran, and Newell’s (1983) Model Human Processor (MHP) to provide a computational account of production rule 16 (see arc labelled 16 in Figure 28), which represents only a small part of the cognitive activity described by the MHS. The MHP was developed as an engineering approximation of human information processing from which constraints and properties of human-computer interfaces can be derived (Card, 1984). It encapsulates much of what is known about human information processing constraints in a specific processing architecture, which incorporates sensory, cognitive, and effector

processors, along with a working memory and a long-term memory (Barnard, 1987). Each processor has an estimated cycle time (Card, 1984; Newell and Simon, 1972). Memories have a given capacity, modality, decay rate, and susceptibility to interference. By the time production rule 16 fires, the scheduler has already chosen a scheduling policy. Sanderson considered, as a “thought” experiment, the cognitive activities required to order four jobs using the Moore-Hodgson algorithm.29 She assumed that the scheduler would order the jobs in his/her head without using external aids such as pencil and paper. Sanderson considered a typical set of mental processors that a scheduler may apply. She estimated the time the scheduler would take to plan the sequence of steps and the load that this planning would impose on working memory. These estimates were obtained using standard

29. The Moore-Hodgson rule has the following steps:
1. Use EDD to order jobs.
2. Find the first job in the sequence that will not be completed by its due date.
3. Remove the longest job prior to the job mentioned in step 2. Place it at the end of the sequence.
4. Go to step 2.


elements for perceptual processing, cognitive processing, eye movement and articulatory rehearsal of letters. Scheduling these four jobs required 75 elemental processes (35 of which were used to just arrange the jobs in EDD order). By aggregating the times for firing of the processors, Sanderson estimated the total cognitive-processing time to make this scheduling decision as 13 seconds. The MHP analysis showed that at times the scheduler would hold five or six items in working memory, which normally has a limit of 4.2 chunks. At these points in the sequence, where there are very high demands on the scheduler’s working memory, errors are most likely to occur. Experienced humans have ways for coping with such demands. They may, for example, hold information in different modalities in working memory (visual versus auditory), move some material to Long-Term Memory (LTM), or even “position” items at different subjective locations in the mind. These findings, Sanderson cautioned, are for very particular conditions. The MHP times assume that the scheduler has moderate experience and that he/she has learnt to avoid excessive memory load by a regular and repetitive pattern of eyemovements over the table of scheduling data. Sanderson also assumed that the scheduler remembers the steps of the policy perfectly and that no paper and pencil aids are used, that the operator is not under undue time pressure, and that the algorithm is executed without error. She only examined a single set of elemental cognitive processes a human may apply in executing the Moore-Hodgson algorithm. Cognitive processes, however, may vary across individuals according to their memory capacity and memory strategies, their level of expertise, the spatial layout of information on the control panel, and the form of representation (visual, graphic, etc.) of data. Most importantly, the nature of the display affects a scheduler’s strategy in performing a task. Sanderson and Moray (1990) refer to an experiment in which schedulers were required to perform under time pressure (Moray, Dessouky, Kijowski, and Adapathya, 1990). The distance from the goal was directly visible in the graphical display. Subjects, restrained in their deployment of the scheduling rules due to time restrictions, achieved better performance by resorting to their “intuition”, especially when the rule was difficult. They substituted perception for reasoning. From this experiment, Sanderson and Moray inferred that a great deal of assistance in the form of decision aids is needed for dealing with scheduling under pressure.
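The algorithm in footnote 29 is compact enough to state as code. The sketch below follows the usual formulation of the Moore-Hodgson rule, in which the longest job among those sequenced up to and including the first late job is moved to the end; the four jobs are invented for the example.

```python
def moore_hodgson(jobs):
    """Sequence jobs to minimise the number of late jobs.
    jobs: list of (name, processing_time, due_date) tuples."""
    sequence = sorted(jobs, key=lambda job: job[2])   # step 1: EDD order
    late, i, t = [], 0, 0
    while i < len(sequence):
        t += sequence[i][1]
        if t > sequence[i][2]:                        # step 2: first late job
            # step 3: remove the longest job sequenced so far ...
            k = max(range(i + 1), key=lambda j: sequence[j][1])
            t -= sequence[k][1]
            late.append(sequence.pop(k))              # ... and park it at the end
        else:
            i += 1                                    # step 4: continue the scan
    return sequence + late

jobs = [("A", 4, 5), ("B", 3, 6), ("C", 2, 7), ("D", 5, 9)]
print(moore_hodgson(jobs))   # [('B', 3, 6), ('C', 2, 7), ('A', 4, 5), ('D', 5, 9)]
```

Even this four-job case hints at the memory load Sanderson describes: the running completion time, the due date under test and the current longest job all have to be held in mind at once when the algorithm is executed without aids.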


In Sanderson’s analysis of production rule 16, the scheduler has to hold five items in working memory just for the first stage of the Moore-Hodgson algorithm, which is merely a simple EDD ordering of four jobs. Clearly, even for the simplest of algorithms when there are more than a few jobs, humans find scheduling extremely cognitively demanding if they cannot use aids. However, in practice, schedulers use pencil and paper, calculators, machine loading boards and/or Gantt charts. Nevertheless, the analysis of an unaided scheduler’s behaviour clarifies the type of cognitive support a particular aid may need to extend to its user.

4.2.4.1 Problem Recognition

The MHS represents more than the cognitive demands on a human executing the steps of a scheduling policy. It depicts a scheduler’s behaviour from the initial alert to the final scheduling decision. The formulation of the problem from the initial alert is a crucial activity. A scheduler, on being alerted to the need to act, starts scheduling activity. Production rule 1 represents the immediate, automatic response to the alert. If the scheduler does not respond in an automated way to the alert, he/she sets as a temporary goal the observation of information and data (production rule 2). Having observation as the goal, the scheduler then acquires data regarding jobs and the state of the system (production rule 3). The acquisition process involves sensation, attention, perception, and pattern recognition (Sage, 1987). It is affected by biasing factors such as the scheduler’s scanning habits, the form of the display and the current scheduling policy (Sanderson, 1990). The latter may latch the scheduler’s attention to information relating to the current policy at the expense of other information. To recognise mappings between data and scheduling policies, schedulers match the pattern of the current situation to one of many archetypal situations stored in their memory (Simon, 1991). Their recognition of a pattern within data depends upon their domain-specific knowledge. Alternative production rules cover various modi operandi. If the scheduler finds a suitable mapping between the data and a scheduling policy, then he/she carries out the policy. In other words, the scheduler clearly sees which policy to apply on observing the details (i.e., the attributes) for jobs to be scheduled. Where the scheduler is familiar with the policy and the steps

it invokes, production rule 7a represents the decision process. Otherwise, the scheduler has to work out the operational steps for the policy (production rule 7b). Where the situation is unclear, there is not a suitable mapping between the pattern in the data and a scheduling policy. The scheduler has to identify the significance of the data (production rule 7c). The scheduler analyses the data for significance (production rule 4). This depends upon factors such as observability of the data, operator knowledge and familiarity and the time available for analysis. At this level in the decision ladder, knowledge-based behaviour is conspicuous. In analysing the data to find significance, schedulers seek to identify a particular system state from a system configuration by:
1. Looking up factual knowledge in their memory;
2. Applying causal reasoning;
3. Generating alternative hypotheses and choosing between them.
These, and other, issues of knowledge-based behaviour are discussed in section 4.3.2 “Representation in the Mind”.

4.2.4.2 Comprehensiveness

While production rule 16 makes the scheduling policy operational, other production rules in the MHS tend to be far more subjective and hence their composition is much more problematic. For Sanderson and Moray (1990) subjectivity is a function of the interaction between a human and the scheduling situation. Different subjective factors influence recognition, the formation of decisions and the execution of actions. A scheduler’s process of recognition may be affected by the time available, system observability, the number of features he/she has to contemplate simultaneously, uncertainty in the system’s state, familiarity of the system’s state, conformity of the system’s state to a known pattern or schema, and the ease of projecting to future states. A scheduler’s decisions may be influenced by the number and similarity of alternative actions, his/her capacity to hold alternatives in working memory, the presence in LTM of previous successful decisions, and the complexity of judging whether a possible decision will satisfy system goals. In the execution of actions, the time available for action, the number of sub-actions in the plan, their familiarity, changes in


system state during actions and the potential for new events interfering with memory are some factors that may affect a scheduler’s behaviour. In the MHS, each production rule has a template that provides structural relationships while avoiding the problems of detail.30 These templates are generic and are not specific to a particular scheduling environment. The extent that a researcher has to draft the production rules in detail depends upon the research questions being investigated. When a researcher comes to detailing a rule for a specific scheduling environment, he/she has to consider behavioural aspects that range from the quantitative to the qualitative and from the objective to the subjective. Lind’s (1991) remarks regarding missing arcs in Rasmussen’s decision ladder are also pertinent to the MHS. While acknowledging the possibility of arcs other than those shown in Figure 28, Sanderson’s position is pragmatic. The MHS is an engineering approximation, although guided by a rigorous theoretical construct. Sanderson does not claim that it is a complete model of human behaviour. Therefore, she would only consider the inclusion of other arcs if empirical studies confirm their necessity.

4.2.5 Multiple Mappings in Recognitional Decisions

In the above discussion on rule-based behaviour, only a single mapping between data and a rule was considered. In practice, there may be more cue patterns that the scheduler could recognise. Instead of acting on the first pattern observed, a scheduler acting proficiently would be on the alert for other cue patterns. The strengths and weaknesses of the various options are never compared (Klein, 1989). However, they are considered roughly in the order of most to least typical.

30. For example, the template for production rule 3 is: if goal = observe data then ... data = f(data available, scanning habits, current scheduling policy, hypotheses, incoming events, time to observe, etc.) ... then data = {d1, d2, ...}

This fits with Simon’s notion (1955) of satisficing, which describes the means of quickly and efficiently finding an effective option. The scheduler evaluates the available options, one at a time, until he/she finds one that is satisfactory. This rule is then used in drafting the schedule. Klein extends the recognitional approach from rule-based to knowledge-based decisions. This mode is less taxing of mental effort than the analytical mode. Because of their experience, experts can quickly generate plausible actions, rather than generating a complete set of possible alternatives, as analytical decision making schemes require. Schedulers, in their endeavour to produce suitable schedules, reflect on plausible goals, critical cues, expectancies, and typical actions (Klein, 1989). For the simplest cases, they immediately know what to do on observing the properties of the jobs and the state of the shop. Otherwise, they consciously evaluate the likely reactions to conceivable actions. Evaluation does not entail testing all possibilities, as this could become quite tedious. Instead, the scheduler looks for potential problems by mentally simulating all the factors that could come to play. Mental simulation enables a skilled performer to be alert to important flaws in a plan without having to examine everything, and without having to decide what to examine and what to ignore (which entails first examining everything). Proficient schedulers, being mindful of shortcomings in their assessments, remain on the alert for cues that show that their expectations have been violated, anticipated events have not occurred, or unanticipated events have occurred. Klein’s (1989) Recognition-Primed Decision (RPD) model advances another perspective on decision making. Serial evaluation marks its departure from behavioural decision theory. Within his schema, schedulers act like proficient decision-makers. They become aware of events that have occurred, recognising typical situations and the ways to respond. They evaluate possible responses one at a time. In trying to anticipate what would happen if they carry out a specific action, they imagine its execution in the specific working environment. Note that generally these decisions do not come about by generating all possible options and then comparing their strengths and weaknesses. Where there are similar mappings, the scheduler may see the signs that satisfy some or all of the conditional aspects of an appropriate rule. However, they may not be attentive to

countersigns that show that a mapping is inapplicable (Reason, 1990). They may also see a mapping to a rule that is familiar but is wrong for the current situation: Reason’s strong-but-now-wrong rule.
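The serial, satisficing evaluation described above can be caricatured as a short loop. The candidate policies, the simulated outcome and the acceptability test are all invented for illustration.

```python
def satisfice(options, simulate, acceptable):
    """Evaluate options one at a time, in order of typicality, and return the
    first whose simulated outcome is judged satisfactory; there is no
    exhaustive comparison of the strengths and weaknesses of all options."""
    for option in options:              # ordered from most to least typical
        outcome = simulate(option)      # mental simulation of the action
        if acceptable(outcome):
            return option               # act on the first workable option
    return None                         # nothing fits: knowledge-based analysis

# Illustrative use: predicted average tardiness for each candidate policy.
predicted_tardiness = {"EDD": 1.5, "SPT": 3.0, "FIFO": 4.2}
policy = satisfice(["EDD", "SPT", "FIFO"],
                   simulate=lambda p: predicted_tardiness[p],
                   acceptable=lambda tardiness: tardiness <= 2.0)
print(policy)   # 'EDD'
```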

4.3 The MHS and Hybrid Human-Computer Decision Making

The MHS representation of scheduling behaviour and Sheridan’s supervisory-control model, discussed in Chapter 2, are both broad constructs not confined to a single methodological representation. In combination, they provide a framework for modelling an HIPSS. The MHS provides a structural relationship between the different modes of supervisory control: between setting goals, forming and implementing plans to reach the goals, monitoring the process, intervening when necessary, and learning from the results of previous actions. A production rule connects an information processing activity with a state of knowledge, or a state of knowledge with either an information processing activity or another state of knowledge. In a hybrid decision-making system, either the human or computer acting alone can produce the consequent action of the production rule. Alternatively, they can act together. While primary decision-making activities may rest with either the human or computer, responsibility for evaluating the solution remains with the human. The interactive process between human and computer is critical to their communion. In Sheridan’s supervisory control model, interaction is between the computer in the HIS (Human Interactive Subsystem) and the human supervisor. The double-headed arc in Figure 22 indicates the communicative link between the human and the computer. The human can send signals to the computer that control its decision-making process and the form of the presentation.31 Through the screen presentation, information is passed to the human. The form of the display affects the human’s decision-making process, as is discussed in Section 4.3.1. It has a “controlling” influence on the human.

31. Humans “present” the signals to the computer through keystrokes or mouse moves.
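One way to picture this hybrid arrangement is as an allocation table that routes each production rule to the human, to the computer, or to both, with the human always retaining responsibility for evaluating the outcome. The rule numbers follow Figure 28, but the particular allocation shown is purely illustrative.

```python
# Illustrative allocation of MHS production rules between the two agents.
allocation = {
    3:  "computer",   # observe information and data (generate the display)
    7:  "human",      # recognise that a policy fits the observed pattern
    16: "shared",     # determine the procedural steps of the chosen policy
    17: "computer",   # carry out the steps of the policy
}

def fire(rule, automate, ask_human):
    """Route a production rule to the responsible agent(s); the human
    evaluates whatever is produced."""
    agent = allocation.get(rule, "human")
    if agent == "computer":
        return automate(rule)
    if agent == "human":
        return ask_human(f"Please perform production rule {rule}.")
    # Shared: the computer proposes, the human disposes.
    return ask_human(f"Accept the computer's proposal for rule {rule}: {automate(rule)}?")
```

The point of the sketch is only that responsibility can be shifted rule by rule without changing the overall recognition-action structure.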


The interactive scheduling process can be analysed using the MHS. Scheduling activity begins when the HIPSS is alerted to a change in the state of the environment. The alert may come from the manufacturing environment (e.g., breakdown or telephone call) or from the computer system (e.g., data on new arrivals). Human schedulers may be so skilled that they know immediately from their observation of the data which jobs they will allocate to which machines and the order in which the jobs will be processed.32 Cases of such skill-based reasoning are not obvious in scheduling. Their occurrence is domain and person specific and therefore difficult to discuss in the abstract. The common response to an alert is the seeking of patterns in data that will trigger rule-based responses: production rules 2 and 3.

[Figure 29 (not reproduced): the Model Human Scheduler of Figure 28, with the rule-based portion of the ladder shaded to indicate where computer support is provided.]

Figure 29. Computer support for rule-based decisions.

32. This tacit level of decision making is not a conscious deliberative act. Skill-based behaviour is a sensory-motor performance that takes place without conscious control, such as exhibited in experimental tracking tasks, and bicycle riding.

33. The distribution of values for a specific job attribute is a sign that activates a particular policy.

The shaded region in Figure 29 shows rule-based reasoning in an HIPSS. There are two stages in the process. First, in observing cue-patterns in the data the scheduler sees a particular scheduling policy as suitable.33 A scheduler, well versed in the manual execution of the policy, would recall the procedural steps from LTM (production rule 7a). Otherwise, the scheduler would have to figure out the
steps before carrying them out. During the execution of the steps (production rule 17), the scheduler searches key attributes among the data. For an EDD policy the scheduler searches all available jobs for the one that has the earliest due date. This is placed at the head of the queue. He/she then searches the remaining jobs for the one that has the earliest due date. It is then placed next in the queue. The scheduler repeats the procedure for the remaining jobs. During the execution of the policy, the information extracted by the scheduler from the data is different to the information used to select a suitable policy. Consequently, the form of the presentation that aids decision making may be different for these two activities. However, as the decisions are procedural for both activities, the scheduler senses the information in the display as signs. In an HIPSS, the procedural steps for some policies may be automated. Nevertheless, schedulers should be able to observe from the computer’s presentation that the resulting sequence complies with the procedural steps for the policy.34 Then they could be assured that the activities of the computer conform to their understanding of the chosen scheduling policy.35 A greater challenge for the scheduler is to make decisions in circumstances where no known heuristics seem to apply. With no policies clearly pertinent, the scheduler has to try to draw some meaning from the data (production rules 7c and 4). For the data to turn into meaningful information, the scheduler must find signs in the data that stimulate scheduling activity (Morris, 1946). Signs emerge when there is some mapping between the presentation and the behavioural response. Their emergence depends on the presentational form and the observer’s cognition. By changing the presentation — by modifying, adapting or transforming the form

34. While the outcome must match the expert scheduler’s expectation, it is not necessary for the computer’s and the person’s algorithmic procedure to correspond.

35. Where the procedural steps are transparent the human can understand how the computer produced the schedule and therefore he/she is more likely to accept the computer’s solution. The question of “trust” is discussed in a later section.


in which data is presented — a mapping may emerge. Conversely, the scheduler could try out different mental models of the scheduling process on the data until a mapping arises. The interactive process between representational form and the scheduler’s mental representation36 may indeed be iterative. On drawing a hypothesis, the scheduler tests it through manipulation of the presentation. This, in turn, leads the scheduler to reappraise his/her representation. Drawing another hypothesis, the scheduler then tests it. If the scheduler finds a mapping signifying the efficacy of a known policy, then the scheduler enacts the policy through production rules 8a or 8b. If a scheduling policy is still indeterminate, schedulers may have to contemplate what criterion they are trying to satisfy (e.g., minimise average tardiness), and then define a heuristic that meets it (production rules 8c and 13). Where schedulers do not know what performance criterion to follow, decision-making moves to the higher knowledge-based level of organisational goals as discussed in Chapter 3. They trial various functional performance criteria until they find one that suitably matches the performance goals. For both situations, schedulers draw on information from the environment as well as that stored in the computer: patterns in current and historical data, their experiential knowledge and expert sources (books, OR consultants, etc.). As schedulers’ problem-solving activities become more knowledge based, they move from cue-patterns to the behaviour of the manufacturing system in processing jobs. Instead of using the display to seek patterns in the value of one or more attributes, they would want the display to be a “window” on the physical world. This would require some form of simulation of

36. Mental representation here refers to the model and not to cognitive processes. The behaviour of a thermostat in controlling a refrigerator is an example where the physical form is commonly different to the user’s mental representation. In their representation of the thermostat, users often think that a refrigerator will cool down quicker if the thermostat is set to a lower temperature than that desired. They think that there is a relationship between cooling rate and temperature setting. In reality, the thermostat is an on/off switch that remains on until the set temperature is reached. As it merely switches the refrigerator’s compressor on and off, it does not control the cooling rate.


the manufacturing process. They would treat the information displayed by the computer as symbols. The set of attributes for a job acts as a symbol for the real physical object. Therefore, the surface features of the display would have to be such that they support symbolic interpretation. The supervisory modes obviously underpin the recognition-action cycles. For example, an alert arises during the monitoring of the manufacturing system. It may arise when information regarding new jobs appears in the computer, or perhaps, when a person observes an event in the manufacturing system (e.g., breakdowns and a job’s actual completion time being at variance with the estimate). The learning mode, however, requires some clarification. The content of memory is not explicitly shown in the MHS. Humans learn from their past behaviour. For example, in enacting a scheduling policy, humans unfamiliar with a policy first have to determine its procedural steps (production rules 16) before carrying them out (production rules 17). With experience, they can recall the steps from memory and production rule 16 is not triggered. If the computer could also learn from a person’s execution of production rule 16, then the procedural steps (production rule 17) could be automated. In summary, the MHS can represent the decision-making processes of an HIPSS using recognition-action cycles. The human and computer, acting separately or together, execute the various production rules. The presentation and control of information between the computer and human in the Human Interactive Subsystem vary with the recognition-action cycle. At the rule-based level, the scheduler seeks dominant patterns in the data that match the conditions for a particular scheduling policy. At the knowledge-based level the scheduler tries to make inferences using information presented by the computer and extrinsic information not held in the computer. The underlying data that the computer shows may not change. Its significance (the informational content) changes, as scheduling activity progresses from the alert, to skill-based, rule-based and knowledge-based activity. As the information presented to the scheduler changes, the representational form for its communication may also need to change.


4.3.1 Information through Patterns in Data

Humans and computers have to find some structure in the data on which to draw inferences (Tukey, 1977; Sibley, 1988; Bertin, 1981). They search the available information, including the attribute values of jobs and machines for patterns, to uncover appropriate structure. At the knowledge-based level, either the human or computer seeks a dominant pattern on which to propose a hypothesis. The inferential process is inductive. In contrast, at the level of rule-based decisions, either the human or computer recognises a pattern in the data that maps to a known rule (e.g., a scheduling policy). Patterns in data can be recognised in various ways (Solso, 1979):

1. Gestalt Psychology: A pattern is found by observing the entire configuration as a whole. The spatial relationship of dots is illustrative. Eight dots can be arranged various ways: ••••••••, •••• ••••, •• •• •• ••, •••• ••• •. While the dots are individually indistinguishable, different subsets are visible when they are taken as a whole. In the scheduling domain, a scheduler may perceive a pattern among the values of a job attribute. For instance, when a scheduler compares the due dates for all jobs, he/she may see a pattern in their distribution.

2. Bottom-up processing: Patterns in higher level information are revealed through aggregation of lower level information (e.g., identifying a dog by first recognising its coat, its four legs, etc.).

3. Top-down processing: Recognition of the whole leads to recognition of the components. For example, from a minimalist sketch of a face, a person can recognise features, such as an eye, that would not be recognised alone (Figure 30). “World knowledge” facilitates identification in appropriate contexts (e.g., stethoscope in a physician’s office).

[Figure 30 (not reproduced): two panels, “Features in context” and “Features without context”, showing facial features that are recognisable within a whole face but not in isolation.]

Figure 30. Facial features recognisable in the context of the whole face (based on Solso, 1979).


4. Bottom-up and top-down processing: The interpretation of parts and wholes takes place simultaneously. By increasing the detail in the features in Figure 30, at some point a person will recognise them out of context (i.e., the person applies bottom-up processing). This suggests bottom-up processing accompanying top-down processing for those components that can be directly identified.

5. Template matching: The pattern in the data exactly matches a template that satisfies a known condition. For example, in entering a computer password a person has to exactly match the stored password.

6. Prototype recognition: Recognition occurs when there is a clear, but not necessarily an exact, match between perceived pattern and an abstracted or idealised pattern.

7. Feature analysis: Pattern recognition is a two-stage process. Incoming stimuli are first analysed according to their simple features. Recognition is obtained through searching a memory store (databases for computers and LTM for humans) for features.

Matching data to a template is easy for a computer. However, this method fails when there is the slightest mismatch: a word written in lower case does not match a template that has the word in upper case. Prototype recognition allows an approximate match. While a person can easily recognise a word written in different scripts by different hands, sophisticated algorithmic procedures are required for computer recognition. The other methods for recognising patterns are difficult, if not intractable, for a computer to apply. They are incorporated to a varying degree into the various theoretical models of human cognition. The process of recognition requires the computer or person to associate the data with a stored representation. Under rule-based decision-making, the stored representation may be just a simple scheduling policy that acts as a prototype. The scheduler recognises the next job to schedule under EDD as the one having the earliest due date. Schedulers cannot restrict their behaviour to simple cases of recognition. Even recognising that the EDD rule applies is not simple. The suitability of one rule over others is dependent upon the pattern in the data. A particular rule may suit jobs that have widely spread due dates and processing times close together, but may be unsuitable if the due dates were close together


and processing times were widely spread. If the pattern of due dates and processing times falls between these extremes, then the scheduler may find it quite difficult to recognise the appropriate policy. It is surmised that recognition may be even more difficult in situations where broader information has to be duly regarded (e.g., average tardiness for the current schedule, customer importance, machine or operator availability). A scheduler’s mental representation of scheduling depends upon the environmental context. Consider, for example, a scheduler whose experience has been obtained within an environment in which the processing times for all the jobs are similar and the due dates are widely spread. He/she will have a different mental representation of the scheduling problem to a scheduler who has always operated in an environment with the converse conditions: similar due dates and widely-spread processing times. At the time a schedule is constructed, the jobs do not have a physical presence. Schedulers only have signs before them. Alphanumeric characters on paper, for example, signify the physical properties of the part to be manufactured. Schedulers’ disposition to respond to the signs depends upon their experiential knowledge. A scheduler may observe a pattern in the values of a particular attribute across jobs. This pattern may be a sign that triggers the recognition of an appropriate policy. The recognition of a pattern depends upon the form of representation: the form of the symbols (e.g., numeric or graphic), their size and spacing. Graphical symbols allow users to make perceptual inferences based on physical features (shape, size, spatial location etc.). It is easier to make such inferences than it is to make logical inferences solely in memory using mental arithmetic or numeric comparison (Casner, 1991; Larkin and Simon, 1987).37 Data can be given graphical form through the application of a metaphor.

37. Applying logical task-descriptions, Larkin and Simon (1987) present a cogent theoretical argument to substantiate this claim.
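To illustrate how a pattern in attribute values might act as a sign for a policy, the fragment below compares the spread of due dates with the spread of processing times and nominates a candidate rule. The statistic, the threshold and the particular rule choices are invented for illustration; they are not claims about which rule genuinely suits which distribution.

```python
from statistics import mean, pstdev

def spread(values):
    """Coefficient of variation: spread relative to the mean."""
    return pstdev(values) / mean(values)

def suggest_policy(due_dates, processing_times, threshold=1.5):
    """Nominate a candidate rule from the relative spread of two attributes."""
    ratio = spread(due_dates) / spread(processing_times)
    if ratio > threshold:
        return "EDD"               # due dates dominate the pattern
    if ratio < 1 / threshold:
        return "SPT"               # processing times dominate the pattern
    return "no clear mapping"      # hand over to knowledge-based reasoning

print(suggest_policy(due_dates=[5, 20, 35, 60], processing_times=[9, 10, 11, 10]))
```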


The problem of constructing a schedule that minimises the makespan of equal parallel machines is analogous to the packing of bins (Coffman, Garey and Johnson, 1978). Job processing times are equated to heights of boxes and the depth of the bins is the scheduling time horizon. Schedulers manipulate images of bins and boxes in their mind. The considerable cognitive load this generates would reduce significantly if the scheduler could compare and manipulate bins and boxes on the screen. To build a schedule, a scheduler would add boxes to the bins. Makespan is a minimum when the highest pile of boxes is a minimum. In arranging the boxes, the scheduler compares the relative heights of the piles and the size of boxes in each bin and those not yet in a bin. To decide what move to make, the scheduler uses spatial reasoning. For example, consider the next step for the situation shown in Figure 31, in which all the boxes are already in bins. The scheduler can see that the pile of boxes in bin 2 is the highest. To reduce the height of the pile, the scheduler could move either box C or D. If D is moved to bin 1 or 3, the height in those bins would then exceed the current height in bin 2. However, if box C is moved to bin 1, the height in bin 1 is less than the new height in bin 2. Bin 3 is now the highest, and as it is less than the height previously in bin 2, there is an improvement in performance (i.e., the makespan is less).

[Figure 31 (not reproduced): three bins, numbered 1, 2 and 3, each holding a pile of boxes labelled A to F; the pile in bin 2, containing boxes C and D, is the tallest.]

Figure 31. Parallel machine, makespan problem displayed as a multiple bin, minimum height problem.

Presenting the data within a bin-packing metaphor helps schedulers to recognise patterns in the data and to decide upon actions that would reduce the makespan. The computer presents information to users in a graphical form that makes it easy for them to solve the problem analogically (Eberts and Eberts, 1989). The metaphor guides a scheduler towards solution methods that fit the analogy. If the metaphor is not already part of the users’ problem-solving repertoire, the explicit representation provides an opportunity for them to internalise the analogical model. As the bin-packing metaphor is specific to the objective, it may be inappropriate for other performance objectives: the minimisation of the mean tardiness for instance.
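The spatial reasoning in the bin example can also be expressed directly. In the sketch below, bins are piles of (box, height) pairs, the makespan is the height of the tallest pile, and a candidate move is judged by the makespan it would produce. The box heights are invented so that the arithmetic reproduces the reasoning about boxes C and D above.

```python
def makespan(bins):
    """Makespan = height of the tallest pile of boxes."""
    return max(sum(height for _, height in pile) for pile in bins.values())

def try_move(bins, box, src, dst):
    """Return the makespan that would result from moving one box."""
    trial = {b: list(pile) for b, pile in bins.items()}        # copy the piles
    moved = next(item for item in trial[src] if item[0] == box)
    trial[src] = [item for item in trial[src] if item[0] != box]
    trial[dst] = trial[dst] + [moved]
    return makespan(trial)

# Bin 2 (boxes C and D) is the tallest pile, as in Figure 31.
bins = {1: [("A", 1), ("B", 2)], 2: [("C", 2), ("D", 6)], 3: [("E", 4), ("F", 3)]}
print(makespan(bins))               # 8: the pile in bin 2
print(try_move(bins, "D", 2, 1))    # 9: moving D would make matters worse
print(try_move(bins, "C", 2, 1))    # 7: bin 3 now sets the makespan, an improvement
```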


4.3.2 Representation in the Mind

The acquisition of information goes beyond the physiological operations of sensation, attention, perception, and pattern recognition that are discussed above. Alone, these explain neither how bits of information aggregate to influence choice, nor how decision-makers process, formulate, acquire, analyse or interpret information (Sage, 1987). Schedulers have to recognise how their observations fit the problem formulation. This requires them to match what they observe to some representation of the problem in their mind.38 Once they have recognised the state of the system, they can then act. The simplest form for describing a scheduler’s action sequence is an IF-THEN production: if the pattern fits a template, a rule then fires. The rule may consist of elaborate scripts of action that fit a stereotyped knowledge structure (Schank and Abelson, 1977). The classic example is the “restaurant script”. When persons enter a room in which the physical cues match a restaurant, scripted behaviour for a restaurant is invoked. As the sequence of events unfolds there is a scripted response for each cue that arises. On arrival, customers wait for a waiter to show them to a table. When the waiter hands them a menu, they know, without being told, that they are to select their choice for each course. Persons working in a manufacturing system have some conceptual understanding of how the system functions. They come to expect the system to respond in a particular way when they interact with it. While a person’s mental representation of the functions and objects in the system may not necessarily truly reflect the actual form, it still may guide his/her actions towards the meeting of the chosen goals (Norman, 1986). Rasmussen refers to the mental representation of the dynamic behaviour of the physical system as a mental model (Rasmussen, 1990). He uses the term mental model to characterise those features of a person’s knowledge base that represent properties of the task environment — the structural configurations of elements and their functional relationships — that can serve the

38. Some simple forms that researchers have used to describe representations in the mind are templates, prototypes and collections of feature-analysis primitives.


planning of activities and the control of acts (Rasmussen, 1986). A mental model is, however, a simplified abstraction of the real world that is sufficiently detailed to help a person interpret signals from the environment, and to work out appropriate actions (Rasmussen, 1979; Green, 1990; Moray, 1997, 1999). The mental model bridges the work environment that a person wants to control and the mental processes underlying this control. For Rasmussen, knowledge in the form of procedural steps is not part of a mental model. For example, he would not call the restaurant script discussed above a mental model.39 From his standpoint, mental models do not operate at the rule-based level, as behaviour at this level is merely that of empirical mappings of cue-patterns. Such mappings do not rely on functional representations of the system. Therefore, he restricts the meaning of mental model to those activities associated with knowledge-based reasoning, and therefore it does not apply to learnt rules or well-practised skills.40

39. Rasmussen’s framework helps resolve the question posed by Wilson and Rutherford (1989) of the difference between schemata, and the like, and mental models. They see schemata as somewhat permanent structures held in memory that are available for use by a mental model. Mental models are seen as creations of the moment. They are temporary data structures built during understanding. They are the utilisation of these other representations in a dynamic situation. The restaurant script provides a framework on which to hang the current scenario: the actual details for each restaurant “enactment” are different. From the perspective of human factors, or cognitive engineering, these representations are at the level of rule-based behaviour. Cues from the environment trigger appropriate procedural steps.

40. In their study of Charlie, Dutton and Starbuck (1971) reported that he used a mental look-up table of speeds for particular situations that were distilled from a long series of unique experiences. They stated that he talked as if the existence of a computation procedure was a novel idea. Having not perceived the underlying functional relations, he operated at the level of rules and reasoning was never raised to the knowledge-based level.


The systems-thinking approach to supervisory control is pragmatic. For systems design purposes it is not necessary to have detailed models of the actual mental processes engaged by operators, but rather higher-level structural models of the activities they can use (Rasmussen, 1986). They are mechanisms whereby humans can generate descriptions of system purpose and form, explanations of system functioning and observed system states, and predictions of future system states (Rouse and Morris, 1986). A person’s mental model is based on his/her expectations and experience and on his/her current perception of the system. It provides a basis of his/her understanding (Norman, 1983, 1986; Young, 1983). Wilson and Rutherford (1989) state that there is confusion between the psychological and human factors notions of mental models. Simply put, in psychology a mental model is a description of a person’s mental processes, whereas in human factors it is the product of such processes. Another source of confusion, at least with Rasmussen’s account of mental models, is the claim that these are not psychological models but functional models. Wilson and Rutherford find this claim quite perplexing as most psychological models embrace a functionalist philosophy. However, Rasmussen is referring to the attributes of the physical world: a mental model of a fluid dynamic system cannot flout the principle of continuity of mass flow. The model is a functional explanation of the physical system’s behaviour, depicting the relational structure of the environment (Craik, 1943).41 Examples are the speed range of a machine, the maximum number of colours that a printing press may apply, and job-machine relations such as processing time. This is quite different to a functionalist explanation of human behaviour. The form of representation may affect the reasoning process. In creating schedules, schedulers tend to invoke mental images of jobs being processed. Experimental evidence supports the proposition that generally people use mental

41. Johnson-Laird (1983) differentiates between physical and conceptual mental models. Physical mental models represent the perceived world in a fixed direct fashion. In contrast, conceptual models represent more abstract entities and relations.

imagery42 to solve problems (Huttenlocher, 1968). Where subjects are asked to compare the size of a few objects, those persons who use mental imagery perform better than those who do not (Bower, 1972; Moyer, 1973). Images that form depend upon expertise. An expert’s mental image may be within a specific theoretical framework. In analysing the dynamics of a car’s suspension system, a mechanical engineer may not see a car in his/her head. Instead, the picture may be a schematic model of idealised springs, masses and dashpots. To observe dynamic interactions, he/she runs a mental simulation. The simulation may operate under conditions that do not occur in practice. For example, to observe how particular spring-mass-dashpot subassemblies interact, the engineer may hold other subassemblies rigid. In their endeavour to draw out relationships between jobs, schedulers may manipulate and compare representational images.43 Sometimes they resort to constructing graphical symbols to represent signs in a form that helps perceptual inferences. For example, in comparing jobs with different values of a particular attribute, they may represent the different values as lines sketched on paper, for

42. How the information forming the mental model is codified in the mind is open to conjecture. Solso (1979) presents three competing theoretical postulates that account for the cognitive processes. The radical-imagery hypothesis holds that subjects convert visual and verbal material to images that are then stored in memory. The dual-coding hypothesis suggests that there are two coding and storage systems, verbal and imaginal, and information may be coded and stored in either or both. The conceptual-propositional hypothesis supposes that visual and verbal information may be represented in the form of abstract propositions about objects and their relationships. Depending upon the theoretical perspective, mental imagery may be taken as a pragmatic mnemonic, a component of propositional representation, or merely as subjective epiphenomena: secondary phenomena that do not causally participate in reasoning or problem solving. Wilson and Rutherford (1989) provide an excellent summary of the debate.

43. While not conclusive, this is supported by the field study discussed in the following chapter.

instance, processing times shown on a Gantt chart. The degree of mental imagery is currently at the level of conjecture. Nevertheless, it is a reasonable presumption to say signs (usually displayed as alphanumeric characters) pertaining to job attributes have some effect on the representational images produced in the scheduler’s mind.

4.3.3 Mental Representation and the Environment

For knowledge-based behaviour, a person’s mental model cannot just be a simulation of surface features. In making inferences and predictions under novel circumstances, a person has to apprehend the underlying relational structure of the physical process (Johnson-Laird, 1983). Relational structures may be at the level of commonsense understanding or in the form of specialised knowledge of physical behaviour. Edlund and Lewis (1994) use as an example an aeronautical engineer’s mental model of wing behaviour to discuss the difference between these two forms of understanding. In the engineer’s mental model, the behaviour of air flowing over the wing is constrained by both a commonsense understanding of the fluid flow and specialised engineering knowledge of fluids and airfoil design. The more specialised engineering knowledge, expressed as heuristics and equations learnt at university, comprises the instructions that further constrain the commonsense model. Edlund and Lewis use the concept of attunement,44 from Situation Theory (Barwise & Perry, 1983; Devlin, 1991), to describe commonsense

44. While Situation theorists also argue that the acquisition of information from a situation depends upon those constraints to which the agent is attuned, they use the noun ‘attunement’ as a quality of a constraint associated with the situation. Which information an agent is attuned to depends upon the form of the constraint. For example, the relationship between the number of rings in a tree trunk and the age of a tree is the nomic constraint (attunement) that allows a forest ranger to be attuned to the age of a tree (Devlin, 1991).

understanding.45 A person is seen to be attuned to the constraints in the situation. Many constraints, especially those of natural laws (nomic), exert a pervasive influence on a person’s experience. The effect of an object’s weight and the impossibility of having two objects occupying the same space simultaneously are examples of experiences that are integral to a person’s life. For nomic constraints, the cognitive processes for modelling the dynamic behaviour of systems are automatic and therefore not subject to mental processing limitations or introspection (Edlund and Lewis, 1994). Nomic constraints elicit attention automatically, and are extremely resistant to extinction. Within Situation Theory, those aspects of a mental model that cannot be automatically processed are in the form of instructional rules similar to production rules found in traditional models (Edlund, Weise and Lewis, 1995). The type of situation determines the constraints to which a human agent is attuned. The situation depends upon the type of objects, their relations and their constraints (Edlund, Weise and Lewis, 1995). In contrast, instructional rules are situation independent, effortful, and consciously used to guide, restrict and interpret the behaviour of the model. Therefore, mapping a situation that requires extensive explicit processing to a situation where the same constraints are expressed as attunements can enhance human problem solving efficacy. By expressing constraints associated with scheduling rules as attunements, cognitive

45. Situation Theory, Situated Action, Activity Theory and Action Theory are variants of a functional psychology that relates perception and action. Their foci are different: logic and linguistics are the subject matter of Situation Theory; Situated Action is directed towards psychology, sociology, and education; the focus of Activity Theory is the individual’s consciousness in shared practical activity; Action Theory concentrates on tasks and action design at the work place. Action Theory has a structure that fits well with Rasmussen’s SRK. Through an inner regulatory process, a person moves in small steps by seeking subgoals that move the problem forward towards the final goal. To obtain an overview of these different theoretical perspectives see Norman (1991), Greif (1991) and Bødker (1991).

load on the scheduler may be reduced.46 By using bin packing as a metaphor for the makespan problem, a scheduler attunes to nomic constraints — those of solid objects regarding spatial location — that would otherwise not be present in the problem.
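To make the bin-packing reading concrete, the following minimal sketch (illustrative only; the job names, durations and the longest-processing-time heuristic are assumptions, not data from the thesis) treats each job’s processing time as the height of a box and each machine as a bin, so that the makespan is simply the height of the tallest stack:

    # Bin-packing view of the makespan problem: each job is a box whose height
    # is its processing time; each machine is a bin (a stack of boxes).
    # Longest-processing-time-first is used purely for illustration.

    def pack_jobs(processing_times, n_machines):
        """Place each job (box) on the currently shortest stack (bin)."""
        stacks = {m: [] for m in range(n_machines)}    # boxes in each bin
        heights = {m: 0.0 for m in range(n_machines)}  # accumulated processing time
        for job, p in sorted(processing_times.items(), key=lambda kv: -kv[1]):
            m = min(heights, key=heights.get)          # shortest stack so far
            stacks[m].append(job)
            heights[m] += p
        makespan = max(heights.values())               # height of the tallest stack
        return stacks, makespan

    # Hypothetical example: five jobs (durations in hours) on two identical machines.
    jobs = {"J1": 4.0, "J2": 3.5, "J3": 2.0, "J4": 1.5, "J5": 1.0}
    stacks, makespan = pack_jobs(jobs, n_machines=2)
    print(stacks, makespan)

Reading the schedule length off the tallest stack is precisely the spatial (nomic) attunement that the metaphor is intended to exploit.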

4.4 Supporting Knowledge-Based Behaviour

Schedulers who are operating at the level of knowledge-based reasoning consider possible outcomes by mentally modelling the manufacturing environment. If they could use attunements instead of instructional rules in constraining the dynamic behaviour of the model, then they might find the modelling process easier. An HIPSS that presents information to schedulers in a way that they can directly perceive the attunements would reduce the need for any mediating inferential process. A system that substitutes visibility for storage and replaces mental operations with visual inferences would therefore effectively support its users (Lewis, 1997). The attunements act as external memory.47 By having the computer display boxes and bins in the makespan problem, the scheduler becomes directly attuned to the spatial constraints that map to scheduling criteria. For direct attunement, the scheduler has to be able to perceive functional relationships from observing surface features of the objects on the screen. What

46. Lewis (1991) uses two-dimensional blocks as a cover story for flowshop scheduling. As a visualisation, it is realisable because it does not violate nomic constraints or their attunements. He shows that for job shops, time and processing duration can be rewritten as spatial location and extension, but a spatial relation cannot be used to envision the constraint that only a single operation for a job can be processed at a time. This partial visualisation produces the familiar Gantt chart.

47. Hutchins (1995) describes how in modern aircraft the cognitive burden on memory for intended and current speeds is allocated to external memory structures within the cockpit.

features support direct attunement? The concept of affordance from Gibson’s (1966, 1979) ecological physics offers a theoretical underpinning for designing suitable screen objects. The environment is described in terms of opportunities for action it presents the actor (Kirlik, 1995). An affordance is the relationship between properties of the environment and properties associated with the organism’s capabilities for action. It specifies the match between the environmental structure and the functional abilities of the actor (Kirlik, Miller and Jagacinski, 1993). Affordances are functional properties of an object and its surfaces taken with reference to the actor (Gibson, 1977). While attunement is something an organism actively does to pick up information from the environment, affordances are properties of the environment, defined with respect to the action capabilities of an organism (Vicente, 1997). They are the possibilities for action in the environment.48 Although affordances are relational properties, once the appropriate relations are determined, the actor-side of the relation can be left implicit (Kirlik, Miller and Jagacinski, 1993). For example, the relation between the shape and size of an actor’s body and the dimensional properties of an object determines whether it affords sitting. However, once the actor’s body has been taken into account, objects can be described using sitting affordances.49 When a perceived property matches the actual property of a thing, then the user

48. Perceivers learn to attune themselves to increasingly higher-order invariants and can make use of ever more subtle affordances to satisfy their needs (Kirakowski, 1997).

49. In looking around your kitchen, you will see objects that afford pouring: jugs, milk cartons, saucepans. When you consider whether an object affords pouring, you usually do not refer to who would do the pouring. No explicit reference is needed as these objects fit the “human scale”. Nevertheless, affordance relates to the user’s action. Information specifying an affordance — its dimensions, its mobility, etc. — must not be confused with the affordance itself: an accusation that Flach (1995) levels at Norman. While the object may meet specific physical criteria, it still may not afford pouring, for example, by a small child or a person with arthritic hands. A person’s perception of affordances indeed varies as he/she learns and as his/her bodily dimensions and capabilities change (Neisser, 1993).

knows what to do just by looking (Norman, 1988). On seeing a chair, a person knows that it affords sitting. Designers communicate the intentional purposes (i.e., the designer’s model of its use) of their designs through affordances. A handle on a door affords grasping and pulling, whereas a flat plate on a door affords pushing (Norman, 1988). Computer interfaces are designed objects as well, so their affordances must also be carefully designed (Gaver, 1991). On-screen buttons afford clicking, but do not afford moving or modifying (Mohnkern, 1997). In addition, the degree of attunement to a particular affordance depends on the informational clutter. If a particular object on the screen forms but a small part of what the person actively sees, then its effect on his/her mental state would likely be quite small, unless the object has special features to attract attention (e.g., blinking) or the user has incorporated attending to it into his/her behaviour because it is perceived to be especially important (e.g., a critical alarm).

The computer display in an HIPSS is an intermediary between the user and the problem domain behind it. If the interface effectively maps the information and the controls necessary for the scheduler to interact with the problem domain, then the internal situations in a person’s mental model will be systematically linked to the external situations that he/she perceives (Devlin, 1991). The user’s focus can be on the scheduling problem itself. That is, the scheduler feels as if he/she is acting on the domain itself, rather than on the interface. In Norman’s (1991) terminology, the scheduler feels engaged in scheduling activity, and not engaged in managing a computer.

In the bin-packing metaphor for the makespan problem, the boxes afford stacking in a bin. However, not all affordances that may be inferred by the metaphor are present. If a box shown on the screen does not fall when it is not supported on its underside, then it does not behave like a real box. The representation does not support the weight property of the metaphor. By reducing the screen representation to two dimensions, an observer may perceive that the boxes and bins are weightless. This fits with commonsense understanding (nomic attunement) of two-dimensional sketches (e.g., in using house plans to position cardboard cut-outs of furniture, people are generally unperturbed by the lack of representation of the furniture’s weight). Even this reduced form does not

completely map to the problem domain; the width of a box is not associated with any attribute of the job. Thus, for the metaphor to act as an adequate model, a person manipulating the boxes on the screen must be restricted to vertical stacking. To support stacking behaviour, the width of all boxes must be such that no two boxes can be placed side by side. Objects may also have properties that are not mapped to features in the metaphor (Mohnkern, 1997). The bin-packing metaphor is a highly restricted representation of the scheduling domain; it only supports a single attribute of a job, that is, its processing time. In an HIPSS, the scheduler should be able to effectively interact with the domain behind the interface. The scheduler needs to get beyond the syntax of the interface to the semantics, to what the interface relates to (Kirlik, 1992). A well-designed, intuitive, interface may poorly support representation of the problem domain. While users may be able to see how to place boxes in bins, the information and actions that are available at the interface may not allow them to perceive the demands and opportunities in the problem domain. To get beyond the surface level and consider the scheduling problem at a deep level, schedulers must rely upon their mental model of the underlying scheduling domain. However, persons tend to forget functional properties of the problem domain that are not mapped to perceptual features of the display (Hollnagel, 1981). The problem with the use of metaphor is that as decision-makers become more experienced, they focus on increasingly superficial aspects of the problem (Wood, Shotter and Godden, 1974). If schedulers become too reliant on metaphorical representations, their experience of the environment becomes abstract and indirect (Hancock and Chignell, 1995). Therefore, their perception of the problem results in actions that may be discordant with the consequential actions on the shopfloor. By perceiving scheduling as the packing of a bin, for instance, schedulers may come to see the problem only in terms of the intermediary. They may not relate the stacking actions to the resultant ordering of job manufacture.


4.5 Mapping Surface Features to Situational Constraints

What features make a computer display an effective interface for hybrid human-computer decision-making in scheduling? Applying the decision-making framework of the MHS to the hybrid scheduler, interaction occurs between human and computer during both the recognition and action phases. During a recognition-action cycle, the scheduler may use the interface to obtain relevant information and to instigate appropriate action. The recognition-action cycles occur at the various levels of skill-, rule- and knowledge-based reasoning. Whether a scheduler at a particular time is attuned to signals, signs or symbols in the display depends upon the type of reasoning currently applied. Before a scheduler can effortlessly extract information, it has to be in a form that befits the decision level. By mapping higher-order functional information to features in the display, schedulers can use their perceptual capabilities. Therefore, the aim is to map goal-relevant constraints from the work domain onto salient perceptual properties of the display. Approaches that use the mapping principle are Vicente and Rasmussen’s (1990, 1992) Ecological Interface Design (EID), Woods’s (1991) representation aiding, and Bennett and Flach’s (1992) configural displays.


4.5.1 Application of Mapping to Supervisory Control of Continuous Processes

Figure 32. Organisation of a composite symbolic display for an industrial plant. (Rasmussen, Pejtersen and Goodstein, 1994, adapted from Lindsay and Staffon, 1988).

There has been interest in applying the mapping principle to supervisory control of nuclear power plants (Abbott, 1982; Beltracchi, 1987; Lindsay and Staffon, 1988; Rasmussen, Pejtersen and Goodstein, 1994; Hansen, 1995; Vicente et al., 1996). The display presented by Rasmussen, Pejtersen and Goodstein, shown in Figure 32, is particularly illustrative of an interface in which users can perceive information at multiple levels of abstraction. How different forms of presentation can coexist in an interface is easily observed in this example. To obtain an understanding of how an interface can afford perception at different levels of


abstraction, some key features of the display will be described.50 The top left-hand corner of the display concerns the heat transfer between the primary coolant circulation and the secondary coolant circulation. This part of the display is shown in Figure 33. Graphical patterns show the circulation paths of primary and secondary coolant. The background boxes represent the heat-producing reactor and the heat exchanger. The polygons overlaying these boxes represent the cooling circuits. The directions of fluid flow in the primary and secondary coolant circuits are, respectively, clockwise and anticlockwise.

Figure 33. The coolant circulation in the primary and secondary circuits.

Temperature sensors drive the vertical position of the corners of the polygons. The vertical position of a corner signals the temperature, which the operator can read using the vertical temperature scale. By locating the corners relative to the boxes representing the components, the operator can recognise the part of the circuit being considered. For example, the temperature of the primary coolant entering the heat exchanger is TP3. The display allows the supervisor to operate at the lowest level in the means-ends abstraction hierarchy, the reading of instruments. While primary sensor data generates the display, the shape of the graphical patterns supports perception at higher levels of abstraction. In this example, the higher-level constructs are heat transfer and the operational state of the entire coolant circuit. The relative size and position of the polygons allow skilled

50. The mapping principle applied to supervisory control of nuclear power plants is comprehensively described in the above sources.

operators to perceive, at a glance, the status of these abstractions. From the juxtaposition of the two polygons in Figure 33, an experienced observer can recognise that the heat exchanger is operating normally: heat flows from the primary to the secondary system. An observer who has thermodynamic knowledge may perceive the display as symbols. The vertical edges symbolically represent temperature difference across the heat exchanger: temperature drop for the primary circuit and temperature gain for the secondary. Critically, TP3 being higher than TS2 signals that the heat transfers from the primary to the secondary circuit. The bottom corners, TP4 and TS1, being nearly level indicate that all available energy has been transferred. TP4 is slightly higher than TS1, as a threshold temperature difference is required for heat to start transferring. The diagram may also act as a sign. An experienced operator may recognise that the relationship between temperatures suggests normality without understanding the thermodynamic principles of heat transfer. The display allows the operator to compare temperatures from their spatial displacement. By using physical distance as a sign for temperature difference, the operator avoids the cognitive demands of mental arithmetic. The spatial pattern is a sign that the behaviour is normal. Indeed, operators may become so conversant with the display that they could ignore the functional properties of the process being controlled. In assessing the state of the system they then would not go beyond perception of the pattern in the surface features (Hollnagel, 1981).

Figure 34. The propagation of a disturbance through a thermodynamic system (based on Rasmussen, Pejtersen and Goodstein, 1994).
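How such a mapping might be computed is sketched below (hypothetical code, not the display software described in the cited sources; the temperature range, pixel scale and sensor readings are assumed values). Sensor temperatures drive the vertical positions of the polygon corners, and the direction of heat flow becomes a geometric relation that can be read at a glance:

    # Map coolant temperatures to the vertical positions of polygon corners so
    # that the direction of heat flow becomes a perceivable geometric relation.

    def corner_height(temp_c, t_min=0.0, t_max=400.0, pixels=300):
        """Linear mapping of a temperature reading to a vertical screen position."""
        return pixels * (temp_c - t_min) / (t_max - t_min)

    def heat_flow_sign(tp3, ts2):
        """The relation read at a glance: primary inlet (TP3) drawn above the
        secondary outlet (TS2) signals heat flowing from primary to secondary."""
        if tp3 > ts2:
            return "normal: primary -> secondary"
        return "aberrant: secondary -> primary"

    # Hypothetical sensor readings (degrees Celsius).
    readings = {"TP3": 310.0, "TP4": 225.0, "TS1": 220.0, "TS2": 285.0}
    corners = {name: corner_height(t) for name, t in readings.items()}
    print(corners)
    print(heat_flow_sign(readings["TP3"], readings["TS2"]))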

The use of perceptual inference by operators is even clearer when it comes to monitoring the response of the system to a disturbance. Figure 34 (a) shows the state of the system immediately after a disturbance. The primary coolant flowrate has been suddenly increased. The polygon representing the primary circuit has shrunk to a smaller size than that shown in Figure 33. The smaller size indicates that the coolant at a lower temperature than previously is transporting the power from the reactor. In the heat exchanger, the vertical edge of the primary circuit is smaller than the secondary circuit. The easily perceived difference in height between the juxtaposed polygons acts as a clear sign that the system’s behaviour is aberrant; the power is flowing from the secondary to the primary circuit. An experienced operator is therefore aware that a transient is dominating the system’s dynamic behaviour. This awareness may be at a surface level. From the visual pattern they recognise the state of the system (i.e., their behaviour is rule-based). Alternatively, if they have a sufficiently deep understanding of thermodynamics they could analyse the state of the system through considering the flow of heat energy across the exchanger (i.e., knowledge-based behaviour). Operators watching the display can see the primary-circuit polygon enlarging until the corner TP3 is higher than TS2 as shown in Figure 34 (b). This indicates that the normal direction of power flow has been re-established. In addition, as the bottom corners of the two polygons are not closely aligned, operators could readily surmise that the response of the system is not yet steady. The above discussion highlights an important difference between conventional displays and those that apply the mapping principle. Primary data can be observed with both types of display. Operators using the “mapped” display can also observe the dynamic behaviour of the system as they follow the flow of heat. In contrast, where the display is conventional, for example, a bank of analogue meters showing the process temperatures, operators must calculate the heat flow from the primary data and then deduce operational behaviour. In an ecological interface, the functional and intentional constraints — the sources of regularity of the system — are mapped to surface features (Rasmussen, Pejtersen and Goodstein, 1994). For the coolant-circuit display, the principle of heat flow is the functional invariant; it constrains the values of temperature in the system. Heat from the coolant circuits powers a heat engine. In the engine, steam

drives a turbine within the operational bounds of a Rankine cycle (Beltracchi, 1987). Control-room operators monitor the process and intervene where necessary to keep the system operating within the intentional constraints of the Rankine cycle.51 In monitoring the process, the operators are attuned to these constraints. The part of the display that relates to the Rankine cycle is shown in Figure 35. Its form is based on a conventional temperature-entropy (T-s) diagram.52

Figure 35. Constraints on the thermodynamic cycle.

Temperature and pressure, which are measured by sensors, drive the position of each corner of the polygon in Figure 35. These primary data can be read using the vertical scales. As each corner’s position is generated by dynamic data, the polygon is animated; its shape changes as plant status varies. The invariants constraining the temperature of the process are the conservation of energy and the water-steam phase relationships. As the fluid passes around the circuit it changes from liquid (water) to a mixture of liquid and gas (saturated steam) and then to gas (superheated steam). These functional properties map onto salient perceptual features of the display. Lines to the left of the phase boundary indicate that the fluid is in the form of water. Lines to the right indicate superheated steam. The

51. The intentional constraints are the bounds on the value range within which the designer intended the system to operate.

52. The abscissa represents the entropy.

line enclosed by the boundary indicates saturated steam. As water changes to steam, it passes through a state in which liquid and gaseous phases are mixed. This process takes place at constant pressure and temperature; the scale on the right allows the pressure to be read. In Figure 35 the lines within the closed region of the steam generator indicate that the water is heated until it becomes saturated. It passes out of the box just before it becomes superheated. However, if the horizontal line moves farther up the diagram, then part of the line in the superheated region would be contained in the steam-generator box. This would signify abnormal behaviour (Beltracchi, 1987). The Rankine-cycle display, therefore, allows multiple levels of cognitive control. Operators can easily read the values of the process sensors. They can assess the performance of critical aspects of the system state. For the steam generator, the goal is for the steam to exit on the verge of being superheated. The operators can clearly see how close to this intentional goal the system is operating. In monitoring the performance of the steam generator, they may operate at either rule- or knowledge-based levels of cognition. If superheated steam is present in the steam generator, the operator’s goal is to eliminate all superheated steam. The rule the operator applies is, “decrease the temperature of the steam by decreasing the heat input, until the superheated component is removed.” If the operator ignores the underlying process and focuses only on the surface features, the goal sought is, “the right-hand end of the horizontal line touches the edge of the steam-generator box.” The rule the operator then follows is to move the horizontal line down until the goal is met. If the persons controlling the process have thermodynamic expertise, they could apply the laws of thermodynamics to the process. The display would show the entropy of the steam leaving the process to be too high. By treating the display as a conventional T-s diagram, the operators can calculate the amount of heat-input necessary for the output-steam to be in the desired state. They could then adjust the heat-input to the level that would meet the goal. While the above arguments may seem quite plausible, is there any evidence for a display based on the mapping principle being superior to a conventional display? Vicente et al. (1996) conducted an experiment in which the performance of persons using a Rankine-cycle display on a simulated process was compared to


those using a conventional display.53 The same set of 35 primary variables was represented in both displays. The two main classes of subjects were experienced NPP (nuclear power plant) operators and thermodynamic experts who had no practical experience in NPP supervision. They found that all users of the “mapped” display performed significantly better at detection and diagnosis of transients. Despite the operators’ negative comments regarding the Rankine display, which the experimenters suggested were due to it being very different to current control-room designs, operators using the Rankine-cycle display performed significantly better than the operators who used the conventional display.54

4.6 Human-Computer Interface for an HIPSS

What lessons from applying the mapping principle to supervisory control of continuous processes are relevant to discrete manufacture? In the display for supervisory control of the NPP, the various physical subsystems involved in energy production are nested as boxes around the Rankine cycle. These boxes act as references to the individual subsystems (reactor, heat exchanger, steam generator, superheater, turbine and condenser). The operator can easily perceive to which component the primary data refers. The boxes therefore decompose the display into the separate functional components. The whole system and its parts are on display concurrently. The size of these static boxes indicates the upper and lower temperature constraints of the subsystems (Hansen, 1995). Information that relates to the higher levels of abstraction in the means-ends hierarchy (Table 2) is displayed as aggregates of lower level information (e.g., through appropriate principles for perceptual organisation). Therefore, multiple

53. Vicente et al. (1996) report that there exists an on-line application of a Rankine cycle display and two cases of its application on advanced control-room simulators.

54. Note that the conventional display used by Vicente et al. is conventional only in the type of objects used for displaying information: otherwise, it is not at all like that usually seen in a control room.

levels can be visible at the same time in the interface (Vicente, 1991). Operators can therefore aggregate or decompose the display along either a part-whole or means-ends dimension (Rasmussen and Pejtersen, 1995). A basic precept can be drawn regarding the application of the mapping principle. Displays that use nesting55 may be designed so multilevel information is displayed simultaneously (cf. Flach, 1988, 1990; Gaver, 1991). Higher level information emerges from the aggregation of lower level information (Vicente, 1991). The semantics of the work domain (i.e., the scheduling environment) are mapped onto the geometry of the interface, to reveal the affordances of the work domain in a way that exploits direct perception (Vicente and Rasmussen, 1990). As the constraints relating to system functions (acquired by sensory instruments) do vary, they are mapped to configural elements in the display (e.g., the position of corners of polygons). Thus, changes to the shape of a configural element accompany changes in the values of underlying functions (Hansen, 1995). As decision-making consists of cycles of recognition and action, the interface needs to afford actions. An interface that allows users to act directly on the representation of a functional component may engender a feeling of direct engagement in controlling the underlying referent (Shneiderman, 1982; Hutchins, Hollan, and Norman, 1986). Users feel that the representation is the thing itself (Norman, 1986). That is, the representation successfully stands for the thing in itself, whether concrete or abstract. The nested functional elements are permanent forms in the display’s background. They act as containers for particular features in the dynamic (configural) figures that move over them. They demark the boundary constraints (e.g., permissible maximum and minimum operating temperatures). To meet the performance constraints of the system, an operator applies controlling actions so that the movement of each critical feature of a dynamic figure is constrained within the bounds of its container. Functional relations are shown as connections (e.g., the

55. See Gibson’s (1979) work on how people use nesting in their visual perception of the natural environment. Vicente and Rasmussen (1990) extended the use of nesting to artificial environments.

connecting lines in the configural polygons in the above NPP example). Connections are not restricted to lines that join display elements of the primary data. They can be in various forms, for example, lines that intersect or coincide (Hansen, 1995). The display should be designed so that it is patently clear when the system is in a goal state: properties such as linearity, alignment and symmetry are particularly revealing (Hansen, 1995).56 Deviations from the goal state would then be obvious. In the heat exchanger, for example, alignment of the polygons is a conspicuous sign of the system meeting its goal, that of steady-state behaviour. Horizontal lines are especially effective in revealing the goal state as deviations are easy to detect. In supervisory control of NPP, operators have no interest in entropy, nor generally comprehend its meaning. Nevertheless, a display that consists of a temperature-entropy (T-s) diagram would have better perceptual features than a pressure-temperature (P-T) display. The goal is met when the part of the Rankine cycle contained in the steam-generator representational element is but a single horizontal line. Conversion of liquid to steam occurs at a constant temperature and pressure (Vicente et al., 1996). In the supervisory control example, an operator’s predominant activity is monitoring the process for deviations from steady-state behaviour. Where performance is aberrant, the operator intervenes to bring the system back to normal steady behaviour. This is quite different to scheduling activity. The scheduler’s task is not the maintenance of a steady state. Instead, the scheduler organises work to satisfice performance criteria that may be neither wholly quantitative nor completely specified. In process control, the goal is to satisfy the intentional constraints defined by physical laws. In scheduling, the intentionality derives from the scheduler’s purposes and values, which may include various subjective preferences of the scheduler and the organisation.

56. See the chapter on the “Hybrid Scheduling Interface: the Screen as the Discourse Medium” for a full discussion on the relationship between the graphical features of objects and affordance.

The potential for hybrid-intelligent production scheduling was discussed in Chapter 3. Scheduling activity was described as the management of constraints within environments that are complex and perplex. In planning the changeover of batches of work, schedulers may spend 80% to 90% of their time identifying problem constraints that are relevant in the current context. Some constraints are inviolable while others are just preferable. Preference constraints are tied to performance criteria: for example, to maximise machine utilisation, the order of jobs may be constrained so that there is no time lost in making minor changes to the set-up of the machine. As a means to some desired end, a scheduler may decide to relax some preference constraints: to satisfy the delivery requirements of a customer, the scheduler may allow the constraint on minor set-ups to be relaxed. Constraints may also be imprecise: release and due dates are often vague. Only a few constraints are normally supplied facts. Most constraints consist of semantic relationships formed from inference and induction. Using intuition, schedulers fill in the blanks about what is happening, what can happen, and what will happen on the floor. The scheduling environment is not static. Some changing factors are the distribution of the values of job attributes, the capabilities of machines and the market environment. Consequently, representations in memory are not static (Young and McNeese, 1993). Rasmussen, Pejtersen and Goodstein (1994) see them as a kind of semantic network that evolves and changes as the situation progresses. Since shifts among strategies are frequent, the mental model must consist of many fragments that replace each other repeatedly. It exists only as a work space representation of potential means-ends relations and the relevant properties of each item. Consequently, the mental model is not one stable representation of the scheduling domain, but changes as the strategy in a particular situation unfolds. The behaviour of schedulers is primarily driven by their recognition of the evolving situation (Woods, 1995). As schedulers encounter both familiar and unfamiliar situations, they need to be able to move between skill-, rule- and knowledge-based levels of decision-making (Rasmussen, Pejtersen and Goodstein, 1994). In an HIPSS, the computer has to support human schedulers as they move between


these levels. Information must be available to support all strategies that a scheduler may find useful. How do the display concepts from supervisory control relate to the discrete manufacturing domain? By applying the mapping principle, a computer system can be developed that supports activities independent of any pre-definition of problems and pre-planning of responses (Vicente, 1990). The mapping between means-ends levels must be represented at the interface. In developing a schedule, the scheduler has to perceive the current state, the target states (i.e., states that satisfice the goal), and the scheduling constraints. To recognise the status of the scheduling activity, a scheduler attunes to information in the display. The signals, signs and symbols to which the scheduler is attuned depend upon the information-processing activity and the scheduling goals that are relevant for the current situation. For each goal, there is a scheduling policy (i.e., an operational policy) and some measure of performance. The goal, for instance, may be full utilisation of machines. Percentage utilisation would then be the associated measure of performance. The scheduler may therefore seek the minimisation of makespan as the operational objective. To meet this objective, the scheduler could apply a bin-packing policy. To follow the policy, the scheduler would focus on each job’s processing time and the accumulated processing time on each machine. His/her attention would be drawn to objects that afford perception of processing times. If, however, the scheduler is motivated, perhaps from managerial pressures, to ensure that a particular customer obtains exceptional service, then recognition must include the customer’s name and the due date for the job. The scheduler, attuned to these additional constraints, would seek objects in the computer display that would provide affordances associated with this information.57

57. There may be different affordances for one piece of data. For example, a screen object may afford reading the customer’s name for a particular job and simultaneously afford the recognition of a pattern among jobs for the same customer. This will be clearer when specific features are discussed later.

To meet an intentional goal, for example, the minimisation of makespan, the perceptual structure of the surface must be similar to the cue description for the processing time of each job. The processing time for each job is mapped to a perceptual surface feature (e.g., as depicted in a Gantt chart). The scheduler can then use the display to make inferences on how to meet the intentional goals. The decision has to meet the functional constraints: the physical and logical constraints of the manufacturing process (e.g., no two jobs can be processed on the same machine simultaneously). Unlike process control, which has clear constraints, the goal in scheduling cannot be displayed as an underlying fixed boundary. While constraint boundaries may be shown, some may not be violated while others may be, with some penalty.
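A minimal sketch of such a mapping is given below (the machine and job names, durations and horizon are hypothetical; the thesis does not prescribe this code). Each job’s processing time becomes the length of a bar on its machine’s row, the one-job-per-machine constraint is respected by appending bars end to end, and percentage utilisation falls out as a simple ratio:

    # Map each job's processing time to a horizontal bar (start, end) on its
    # machine's row of a Gantt chart.  The functional constraint that a machine
    # processes one job at a time is respected by appending bars end to end.

    def build_gantt(sequences, processing_times):
        """sequences: machine -> ordered list of job ids.
        processing_times: job id -> duration.  Returns machine -> list of bars."""
        gantt = {}
        for machine, sequence in sequences.items():
            t, bars = 0.0, []
            for job in sequence:
                bars.append((job, t, t + processing_times[job]))  # (job, start, end)
                t += processing_times[job]
            gantt[machine] = bars
        return gantt

    def utilisation(gantt, horizon):
        """Percentage of the planning horizon each machine spends processing."""
        return {m: 100.0 * sum(end - start for _, start, end in bars) / horizon
                for m, bars in gantt.items()}

    # Hypothetical data: two machines, four jobs with durations in hours.
    durations = {"J1": 4.0, "J2": 2.5, "J3": 3.0, "J4": 1.5}
    gantt = build_gantt({"M1": ["J1", "J4"], "M2": ["J2", "J3"]}, durations)
    print(gantt)
    print(utilisation(gantt, horizon=8.0))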

4.7 Summary

In this chapter, a model of human-computer interaction in an HIPSS was developed using Sheridan’s supervisory control model as a foundation. Scheduling was placed within a systems-thinking context in which human schedulers make decisions through purposeful rational action. As manufacturing systems are products of purposeful action, they can be modelled using means-ends relationships. As scheduling activity is characterised by complexity and perplexity, it is impractical to explain and predict a scheduler’s behaviour using functional analysis. Intentional models are better suited for describing the purposive action of schedulers. Their perception of a manufacturing system varies across a means-ends hierarchy and is dependent on the type of reasoning they invoke. In making decisions, their reasoning may be either skill-, rule- or knowledge-based (SRK). Physical properties (e.g., job attributes and the processing capabilities of machines) propagate up the means-ends abstraction hierarchy placing functional constraints on a scheduler’s purposive action. A scheduler’s intentions constrain the behaviour of the manufacturing process (i.e., deciding when and where a job will be processed). Scheduling activity consists of cycles of recognition and action that are interrelated through a decision ladder of nodes and interconnecting arcs, of which a specific form is the Model Human Scheduler (MHS). The various branches in the decision ladder represent the


different forms of behaviour: skill-, rule- or knowledge-based. The MHS is a meta-framework that uses broadly defined production rules that accommodate various representations of cognitive activity. It is a useful tool for analysing the interactive scheduling process in an HIPSS. The construction of a means-ends hierarchy of the work domain and an activity analysis of scheduling behaviour provide a framework for developing an HIPSS. In an HIPSS, scheduling activities are shared between human and computer. The degree of sharing and the interaction between human and computer must facilitate flexible and opportunistic decision-making. In an HIPSS, the computer display has to support a scheduler’s attunement to the properties of the physical system at the appropriate level of abstraction. A scheduler seeks different information from a display depending upon where he/she is in the decision ladder. The information required to recognise which heuristic is appropriate is different to the information required to carry out the heuristic’s procedural steps. Therefore, a scheduler seeks different patterns in the data at different conjunctures. When a scheduler is looking at the computer screen, whether he/she is attuned to signals, signs or symbols depends upon the type of SRK reasoning he/she is currently applying. To support the various types of reasoning, the display needs to provide perceptual affordances that allow the scheduler to effortlessly extract information required for any level of the abstraction hierarchy. By mapping surface features of the display to functional and intentional constraints, schedulers can perceive information at multiple levels of abstraction. In the next chapter, a practising scheduler’s perception of the scheduling process is explored through a field study of the scheduling of a job shop. From the type of scheduling strategies he employs, some generic traits of the scheduling activities of human schedulers in discrete manufacture are extracted. Through a work domain analysis and activity analysis of the data from the field study, the architecture for an HIPSS is developed in Chapter 7.


Chapter 5 Field Study

As described in Chapter 3, job shops are stable for only short periods. Jobs arrive unheralded and with short lead times. Factors other than those used by OR scheduling rules dominate a scheduler’s behaviour. It was argued in Chapter 3 that job shop scheduling becomes the management of constraints to satisfice desired outcomes. By choosing to place a particular job on a machine, a scheduler places a temporal constraint on the machine: while the machine deals with the job, it is unavailable for other jobs. Commonly, schedulers use either machine loading-boards or Gantt charts to make these reservations. A field study of a practising scheduler in a job shop was undertaken:

1. To find supporting evidence for the proposition that scheduling is characterised by the management of multitudinous constraints;

2. To obtain data for a Cognitive Work Analysis (CWA) of job-shop scheduling.

An ends-means description of the physical functions and physical resources of the job shop provides a partial Work-Domain Analysis (WDA). A discussion of the field of constraints that the scheduler has to work within leads to the goals that he seeks and the operational policies he applies. The myriad goals are then consolidated into a structure that links goals at various levels of abstraction. The discussion on scheduling behaviour moves from the scheduler’s statements of his practice to observation of his behaviour. The analysis includes an assessment of performance for some standard OR criteria. The results are then compared to benchmark values found from a simulation of a heavily relaxed model of the job shop. From a discussion of the type of scheduling strategies employed by the scheduler, inferences are drawn regarding the decision-making activities of human schedulers in discrete manufacture. The company used for the field study produces continuous stationery (Higgins 1992). Typical products are forms for invoices or cheques on fan-fold paper. As the company


desires anonymity, its nom de plume is Melamed.58 Like many small jobbing-shops, at Melamed:

• Some machines are state of the art, but not in the realm of Advanced Manufacturing Technology;
• There is no on-line monitoring of processes;
• A plant-wide computerised database holds details on each job;
• A machine-loading board is used as a scheduling aid.

The scheduler, Neil, was also the production manager. As he was responsible for the management of all production, there were endless interruptions to all his activities, including scheduling. Neil’s behaviour was studied to see how he addressed the scheduling task.

5.1 Methodology

A discursive approach was used to analyse the data collected through interview and observation. Although it does not have a definitive methodological structure that allows findings to be unequivocal, it affords insight into the complexity and the perplexity of scheduling an actual job shop. While there is loss of rigour and certainty in field studies, Klein (1989) cogently argues that they have ecological validity and a richness of observation. They act as a balance to the use of laboratory studies in which university students perform context-limited and unfamiliar tasks that are generalised to applied settings. The study had three parts:

1. An initial investigation consisting of inquiry and observation;

2. A pilot study;

3. An intensive study.

58. The name is in recognition of Uzi Melamed, who generously arranged the field study with the printing company and helped collect the data during his sabbatical from Rafael, a missile manufacturing plant owned by Israel’s Ministry of Defence.

The initial investigation was used to find the rules that Neil applied in building a schedule. During this period, the two investigators observed Neil’s behaviour over a day and asked detailed questions regarding reasons for observed actions. Both observers were professional engineers, each with over twenty years of experience: one having broad experience in the control of manufacture and the other having responsibility for the production control of a large manufacturing plant. To clarify the conditions and reasons for scheduling moves, they also put forward hypothetical scenarios. Four weeks later they conducted a follow-up interview to find answers to questions that arose during the analysis of the transcript of the initial investigation. For both the pilot and the intensive study, Neil’s use of the machine loading board was observed for three days. On being alerted to a new event, Neil made changes to the machine loading board. After he had finished altering the board, the observers asked him why he made the changes. This technique elicited the local heuristics he applied, and the choices made between competing rules. The observers recorded both the movements of tags, which are cards showing job details, on the board and Neil’s reasons for making the moves. A week after the study finished, they collected the tags and a computer database file, which included the production details for the jobs processed during the study. At first, there was an attempt to record the think-aloud utterances made by Neil: however, it was soon dismissed as the endless interruptions made this method untenable. This forced a change in the data collection method to one of observation and inquiry.

5.2 Description of the Printing Environment

In offset printing, each colour to be printed requires three cylinders to transfer the image to paper: plate, blanket (offset) and impression cylinders. The printing plate, which holds the image, is clamped around the plate cylinder. Moistening rollers deposit moisture onto the plate. Then inking rollers pass over the plate. The water-holding areas of the plate reject the ink while the greasy image accepts it. The inked image then transfers to a rubber blanket. The impression cylinder presses the paper into contact with the blanket cylinder. The ink is then offset to the paper travelling around the impression cylinder. By


combining cyan,59 magenta, yellow and black, the press can produce any colour; these are called process colours. Rather than combining colours to produce a desired hue, printers often use pre-mixed colours that comply with a standard: the Pantone Matching System (PMS).60 There are two types of offset press — web and sheet-fed. Web presses operate at high speed and can print on both sides of a continuous roll of paper. They generally have inline binding and cutting features. After impressing the image on the page, a web press may either place perforations across the sheet to produce fan-fold forms or cut the paper into individual sheets. On fan-fold forms made for track-fed computer printers, the press also places perforations on each side of the sheet and makes holes required by the sprockets. The final product is either fan-fold paper or stacked sheet. Sheet-fed presses operate more slowly than web presses, but are faster to set up (known as “make ready” within the trade): accordingly, they suit short production runs. Figure 36 shows the configuration of the shop at the beginning of the study. The Akiras are web presses and the Tridents are sheet-fed. The Akiras vary in the number of colours they can print. Significantly, Akira 4 was not used for printing; it only converted continuous paper into sheets to be used on Tridents or sold as blank paper.

59. Also known as process blue.

60. As well as the four process colours, Melamed frequently uses reflex blue, warm red and PMS 423 grey, 470 brown, 354 green, 347 green, 021 orange and 032 red.

Figure 36. Original configuration of machines.

A principal resource constraint is the restricted availability of cylinders (Table 3). Each colour requires a cylinder set. For example, a maximum of six colours can print concurrently on 297mm-depth sheet across the four Akira presses. Where a six-colour job requiring this size cylinder is run, no other press can process jobs of like size. Where all the cylinders are serviceable, this restriction is inconsequential, as only 1% of jobs require six colours. However, there is little leeway for withdrawing malfunctioning cylinders. While they accept sheet-based jobs, Melamed sees their market niche as the provision of a fast turnaround service for fan-fold forms. Their perception is that their competitors’ larger size makes for organisational inertia that hampers the turnaround time.

Table 3. Number of cylinder-sets for the Akira presses.

    Cylinder size        Number of sets
    11"     (279 mm)     10
    11 2/3" (297 mm)      6
    13"     (330 mm)      1
    14 2/3" (372 mm)      6
    17"     (432 mm)     10
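To illustrate how this resource constraint might be expressed computationally, a minimal sketch follows (the data structures and job parameters are assumptions for illustration; only the cylinder-set counts come from Table 3):

    # Availability check for cylinder sets (one set per printed colour).
    # Set counts follow Table 3; the job data are hypothetical.

    CYLINDER_SETS = {"11in": 10, "11_2/3in": 6, "13in": 1, "14_2/3in": 6, "17in": 10}

    def sets_available(size, jobs_running):
        """Cylinder sets of this size not already tied up by running jobs."""
        in_use = sum(job["colours"] for job in jobs_running if job["size"] == size)
        return CYLINDER_SETS[size] - in_use

    def can_schedule(job, jobs_running):
        """A job needs one cylinder set of its size for each colour it prints."""
        return sets_available(job["size"], jobs_running) >= job["colours"]

    # A six-colour job on 297 mm cylinders uses all six sets of that size,
    # so no other job of that size can run concurrently.
    running = [{"size": "11_2/3in", "colours": 6}]
    print(can_schedule({"size": "11_2/3in", "colours": 1}, running))   # False
    print(can_schedule({"size": "17in", "colours": 4}, running))       # True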

Figure 37. The configuration of the machines during intensive field study.

During the field study, the configuration of the shop changed. Three Tridents were taken out of service and a colour-tower was moved from Akira 1 to Akira 4 (see Figure 37). Each tower contains the set of cylinders for printing one colour. The reduction in sheet-fed machines reflects Melamed’s policy not to compete with the many companies specialising in small sheet-fed jobs. With the change, the capacity to produce single-colour fan-fold jobs increased by a third. However, the capacity for three-colour jobs decreased by a third.61 Only the Akiras could print fan-fold jobs, although any press could run the sheeted jobs. The Akiras, however, being web presses must be fed paper in continuous form. After printing, the paper is cut into sheets. Cutting may take place on the press itself or as a separate operation on the Bowe cutter.62 For jobs that are printed side-by-side, that is, two

61. While about 40% of jobs run on the Akiras were single colour, only 10% were three-colour.

62. The Bowe cutter was used on 9% of jobs.

abreast, they may be cut during collation. The input to the Trident is stacked sheet, which the shop obtains in sheet form or converts from continuous paper.

Figure 38. Major physical functions in printing.

The production process at Melamed consists of the major physical functions shown in Figure 38. Finishing and special finishing can be broken down into separate procedures. The arcs show the order of operations.63 Figure 39 shows the major resources used to perform these functions. The closed and open arrows on the arcs show the routing for fan-fold and sheeted paper, respectively. The dashed arcs from Akiras 1, 2 and 4 signify that their links are the same as those shown for Akira 3. The four Akiras are alike, except for the number of colours — one, two, four and six — they can print and the ancillary attachments that provide extra functionality. The ancillaries place additional constraints on the allocation of jobs to presses. For instance, all jobs using ink that needs ultraviolet (UV) fixing must go onto Akira 3, the six-colour press. Where a job has multiple parts, they can run either consecutively or separately on

63. The operation labelled “Cut” is shown separate from “Cut into sheet”, as it includes trimming. Trimming, for example, occurs when ink is bled to the edge of the sheet. To lay ink to the edge, the press applies ink to oversized paper.

one machine. Equally, the split can be between machines.64 A collator then joins the parts. Which collator is used depends upon the job's requirements, as the collators differ in their capabilities.

Figure 39. Major physical resources in printing.

Arcs in Figure 40 show the Ends-Means relations between the physical functions and physical resources. For simplicity, the four Akiras are lumped into a single generalised representation. Note that for some operations different resources can be used (e.g., folding, collating and finishing). In some cases, they are alternatives (e.g., in many cases it is immaterial which collator is used). For others it depends upon the specific operation; for example, the Hunkeler can die-cut the paper and glue transparent windows on envelopes, and Akira 3 is the only resource that can apply ultraviolet light to cure special inks. Arcs from "Special Finish" to the Hunkeler and the generalised Akira indicate these Ends-Means relations, respectively.

64 For fan-fold forms, where the jobs are multi-part the split may be between individual parts, which are brought together on the collator, or boxes can be split (fan-fold paper is boxed and a number of boxes may be in a job). The sheeted jobs conveniently split into boxes.


Figure 40. Ends-Means relationships between physical functions and physical resources. (The figure links the physical functions of Figure 38 to the resources of Figure 39: the generalised Akira, the Trident, the Bowe cutter, the Hunkeler, Minami and Sanden collators, the paper stores, and the boxed fan-fold and sheeted outputs.)

The database file obtained from Melamed had information on 659 jobs in sufficient detail for analysis.65 Figure 41 shows the statistical profile of the resources used in processing the jobs listed in the database. The web presses printed 84% (555) of the jobs; 32% of these underwent no other operations. Of jobs processed on the Akiras, 14% were mere conversions of continuous paper into sheet; 80% of these were feedstock for the Trident and 26% of the remainder underwent other operations (e.g., collating or cutting).

65 This covered the period between the 5th March and the 11th June 1992.


Figure 41. Statistical profile of resources used in processing the jobs listed in the database. (The figure annotates the arcs of the resource network with counts of printed, blank and converted jobs flowing between the stores, presses, collators, cutter and boxed outputs.)

5.3 Characterisation of the Scheduling Problem

Neil focuses primarily on the web presses for two reasons: the major scheduling bottlenecks are located there, and he can often choose between machines when allocating jobs. He uses this flexibility to manage downstream performance. A third of the jobs placed on the Akiras require no further processing. Moreover, the secondary operations (collating, cutting, etc.) vary between jobs. Through judicious placement of jobs on the presses, Neil controls the flow of jobs to downstream machines. He therefore has a means for regulating the formation of bottlenecks on secondary resources. For each job he schedules, Neil has to decide to which machine he will allocate it and then where to place it in the queue of jobs. In ordering the sequence of jobs he considers time lost to set-ups. Three job attributes affect the set-up time:

Depth: The size of the printing cylinders can only be an exact multiple of the depth of the form. Cylinders take about 40 minutes to change.


Width: The perforations between sheets cannot extend to the edge of the paper. Where the width reduces between jobs, the operator can quickly break teeth off the perforating tool to stop holes being punched. However, where the width increases between jobs, the operator has to replace the perforating tool and then remove teeth near the edge. This takes about seven minutes.

Colour: Washing the colour applicator and refilling it with another colour takes ten minutes. Mixing the colour so that it is uniform takes ten to fifteen minutes, depending on the amount of ink required and the combination of colours.

As changing cylinders takes a long time compared to the median processing time of 100 minutes, depth is a crucial determinant in Neil's decision making and is therefore identifiably a major set-up. Changes to width and colour are minor set-ups.

While every job has a value for depth, width and colour, many other factors also affect set-up. Some occur infrequently, for example, setting the paper folder, the file punch and the chopcutter,66 and rewinding the paper after printing. Other factors may arise many times during the printing of a single job; for example, plates are set up at the start of a job and replaced as they wear. Neil uses a table of set-up times to determine time lost for commonly occurring changes (Table 4). He relies on his printing experience to estimate appropriate set-up times for other changes to the configuration of a machine. From the expected wear life of the plates, he estimates the number of changes of plates to process the job.

Table 4. Set-up times for Akira presses.

Factor                   Set-Up Time (minutes)
file punch               45
stop perforator rings    45
cross perforator         6
cylinder change          38
reel change              27
mix PMS colour           10-15
wash up                  10
change plate             12
side-by-side             20
folder                   20
chopcutter               6
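These look-up values and the depth, width and colour rules can be combined into a simple changeover estimate. The following sketch (in Python) is illustrative only: the Job fields, the assumption that a width reduction costs negligible time, and the use of the midpoint of the 10-15 minute colour-mixing range are all assumptions introduced here, not part of Neil's documented practice.

from dataclasses import dataclass

# Selected entries from Table 4 (minutes). The Job fields and the triggering
# conditions below are assumptions drawn from the surrounding description.
SETUP_MINUTES = {
    "cylinder change": 38,
    "cross perforator": 6,
    "wash up": 10,
    "mix colour": 12.5,   # midpoint of the 10-15 minute range in Table 4
}

@dataclass
class Job:
    depth_mm: int          # form depth, which fixes the cylinder size
    width_mm: int
    colours: frozenset     # colours to be printed

def changeover_minutes(previous: Job, nxt: Job) -> float:
    """Estimate the time lost changing a press over from `previous` to `nxt`."""
    minutes = 0.0
    if nxt.depth_mm != previous.depth_mm:
        minutes += SETUP_MINUTES["cylinder change"]       # major set-up: depth
    if nxt.width_mm > previous.width_mm:
        minutes += SETUP_MINUTES["cross perforator"]      # width increase: replace perforator
    # A width reduction (teeth broken off quickly) is treated as costing no time here.
    for _ in nxt.colours - previous.colours:              # colours not already in an applicator
        minutes += SETUP_MINUTES["wash up"] + SETUP_MINUTES["mix colour"]
    return minutes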

66 The chopcutter is a perforator that is an added fixture engaged by a gear. It is used for making tear-off parts on forms.


Setting up a machine consists of modifying the configuration to match the constraint requirements of the job that is to be processed. There are other inviolate restrictions, as discussed in Chapter 3. For example, the number of colours a press can print is not readily alterable and hence is normally invariant. Nevertheless, a press can print jobs that require more colours than its capacity by making multiple passes. However, this requires the washing of colour applicators between passes. By using a press that has insufficient colour capacity, the machine has to be reset during the processing of the job. The processing time multiplies by the number of passes. Therefore, the technical constraint, the colour capacity of the press, results in causal restrictions, the number of passes and wash ups. Neil not only thinks about these constraints but many others also: a non-exhaustive set is shown as IF-THEN production rules in Table 5.

Table 5. A non-exhaustive set of production rules for the technical constraints

1. If the job can be produced in multiple passes, then the number of passes is the number of colours to be printed divided by the number of colour applicators on the press, rounded upward to the nearest integer.
2. If the job requires one colour, then it can be processed on any press, Akira or Trident.
3. If the job requires one colour and it has to be on continuous form, then it can be processed on any Akira press.
4. If the job requires the printing of two colours per pass and it has to be on continuous form, then it can only be processed on Akira 1, 2 or 3.
5. If the job requires the printing of three or four colours per pass and it has to be on continuous form, then it can only be processed on Akira 2 or 3.
6. If the job requires the printing of five or six colours per pass, the colours are not to be produced by overlaying the four process colours and it has to be on continuous form, then it can only be processed on Akira 3.
7. If the job is to be produced in a single pass, its colours are to be produced by overlaying the four process colours and it has to be on continuous form, then it can only be processed on Akira 3.
8. If the job is on sheeted form and the paper supply is on a reel, then the operation can be split between an Akira and the Trident, with the final pass being produced on the Trident.
9. If the job is on sheeted form and the paper supply is a pack of single sheets, then it can only be printed on the Trident.
10. If the job has multiple parts, then the job may be split between machines (i.e., presses or collators).
11. If the job requires UV curing, then it can only be allocated to Akira 3.
12. If the job requires the same cylinder size as the preceding job and its width is less than its predecessor's, then the cross-perforators have to be replaced.
13. If there is no colour applicator on the press that has a colour required by the job and there is no clean applicator, then an applicator has to be washed and the colour mixed and added to the applicator.
14. If the colour being replaced in the applicator is similar, but different, to the new colour, then only half a wash may be required;67 else a full wash is necessary.
15. If the job's width is no greater than 241mm (or under exceptional circumstances 245mm) and the paper is of continuous form, then the forms can be run side by side across the sheet.
16. If the number of cylinders available of the required size is less than the colour capacity of the press, then the number of colours that can be printed in a single pass is equal to the number available.

67 Note that this is not explicit as it depends upon the extent of the similarity.
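Some of these rules translate directly into executable predicates. The sketch below is a partial, illustrative encoding in Python of rules 1, 3-6 and 11 only; the press colour capacities follow the description of the shop, while the Job representation and function names are assumptions made for the example.

import math
from dataclasses import dataclass

# Colour capacities of the web presses as described in the text.
PRESS_COLOURS = {"Akira 1": 2, "Akira 2": 4, "Akira 3": 6, "Akira 4": 1}

@dataclass
class Job:
    colours_per_pass: int
    continuous_form: bool      # fan-fold (web) rather than sheeted
    uv_curing: bool = False

def passes_required(total_colours: int, applicators_on_press: int) -> int:
    # Rule 1: colours to be printed divided by applicators, rounded up.
    return math.ceil(total_colours / applicators_on_press)

def eligible_presses(job: Job) -> list[str]:
    """Akira presses that can print the job, per rules 3-6 and 11."""
    if job.uv_curing:
        return ["Akira 3"]     # Rule 11: UV curing can only go on Akira 3
    if not job.continuous_form:
        return []              # sheeted work is covered by rules 8 and 9, not encoded here
    return [press for press, capacity in PRESS_COLOURS.items()
            if capacity >= job.colours_per_pass]

# A three-colour fan-fold job "sees" only Akira 2 and Akira 3 (cf. Table 6).
print(eligible_presses(Job(colours_per_pass=3, continuous_form=True)))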

Simply including set-ups for depth, width and colour transforms a difficult task into one that is particularly problematic, even for a simplified representation of the situation at Melamed. The case of four equal parallel machines (i.e., if the presses all had the same colour capacity) is not well understood for objectives that minimise the number of tardy jobs or average tardiness.68 When the number of colours each machine can print is considered, the problem becomes even more arduous.69 The number of Akiras in parallel that a job "sees" depends upon the number of colours used in its production (see Table 6). This situation, while not extraordinary, constitutes a difficult class of identical parallel-machine problem (French, 1982). While there have been some theoretical advances in allocating and sequencing jobs on parallel machines where there are major and minor set-ups, they apply to situations that are far simpler than this case study.

68 OR researchers have found heuristics applicable to parallel machines with set-ups that minimise average lateness. Where all jobs are tardy, these will also minimise average tardiness.

69 Eight process colours would be required to produce any colour on both sides of a sheet. At Melamed, however, premixed colours tend to be used and process colours are treated as basic colours rather than mixed for "photographic" reproduction. Jobs requiring mixing of process colours (typically advertising forms) are produced on Akira 3: the six-colour press.


5.3.1 Machine Loading Board

Two other companies occupy the same building as Melamed: a producer of artwork and a manufacturer of printing plates. When a sales representative obtains an order from a customer, a job number is allocated. If necessary, artwork is requested from the associated company. This service is used for about 60% of jobs.71 The rest either require no artwork (e.g., it is already on file) or the artwork is supplied by the customer. On completion, the artwork returns to Melamed. Neil then places an order for plate manufacture. The job "arrives" at Melamed on the receipt of its plates, as it is then available for printing.

Table 6. The relationship between number of colours and the number of parallel Akira presses.

Colours    Jobs70    Number of Machines
1          42%       4
2          42%       3
3          10%       2
4          4%        2
5          0%        1
6          1%        1

Neil tracks jobs as they progress from artwork, to plate manufacture and finally printing and finishing at Melamed. A machine loading board assists him in tracking the jobs and planning the production order of the required operations undertaken by Melamed. The board is about five metres wide by a metre high. It has vertically slotted tracks that can hold job-tags in columns under various headings. There is one tag per job. Jobs progress left to right across the board from Art/Proof (six columns), On-Hold, Plates (two columns), Akira 1, Akira 2, Akira 3, Akira 4, Trident, Minami, Sanden, Hunkeler, Bindery, Renoma, and Despatch. The pillar supporting the roof splits the board between Art/Proof and Plates. Neil described the board left of the pillar as his window into the room behind the wall (the locale of the art company), as the tags show the jobs currently

70 These figures come from the 106 jobs in the database that have the number of colours recorded.

71 Of the 628 jobs in the database that had been delivered (a sign that the customer did not pull the order), 358 jobs had in-house artwork, and 234 required none.


being undertaken by the art company. For many jobs (40% of jobs in the data set), Neil can anticipate their arrival by metaphorically peering into the art room. Neil places each tag under the heading for the machine assigned for the next operation. The vertical order of the tags shows the order of processing. At the beginning of a shift, Neil tries to project a schedule across two shifts. As the shift progresses, he modifies the plan in response to new events.

Figure 42. Job Tag.

Each tag displays characteristics of the job that Neil may want to take into account when making a scheduling decision (see Figure 42). The following fields are listed:
• job attributes: due date, job number, customer, description, quantity, parts, size (depth, width), method (fold, sheet, rewind, single, 2 wide), colours (F.O.B. {Front of Bill}, B.O.B. {Back of Bill}).
• shop-floor resources: press, Minami, Sanden, Hunkeler, Bowe.
• external resources: artwork, proof, plates.
• other: sales representative.

The tags are colour coded to the size of the printing cylinder, which is the unit that takes the longest time to change on a press (see Table 7).

Table 7. Colour of tag relates to the cylinder size.

Colour    Cylinder Size
yellow    14⅔ inch (372 mm)
pink      17 inch (432 mm)
green     11 inch (279 mm)
white     11⅔ inch (297 mm)
orange    out-sourcing

The amount of information shown goes beyond the rudimentary information that machine loading boards typically display (see Chapter 3); however, not all significant information is shown. Missing from the tag are attributes relating to the paper: its width, density and type (surface finish and colour). These are important as a changeover of paper affects the set-up time. Furthermore, other attributes have to be derived from the information displayed, for example, the manufacturing time.
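The tag and the board admit a straightforward computational representation. The sketch below is purely illustrative: the field names mirror the tag description above, while the data types and the dictionary-of-lists form of the board are assumptions introduced for the example.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class JobTag:
    # Field names follow the tag description above; types are assumptions.
    job_number: int
    customer: str
    description: str
    due_date: str
    quantity: int
    parts: int
    depth_mm: int
    width_mm: int
    method: str                                               # fold, sheet, rewind, single, 2 wide
    colours_front: list[str] = field(default_factory=list)    # F.O.B.
    colours_back: list[str] = field(default_factory=list)     # B.O.B.
    artwork_date: Optional[str] = None
    plates_date: Optional[str] = None
    sales_rep: Optional[str] = None

# One ordered column of tags per board heading, left to right; the position of a
# tag within a column represents the planned order of processing.
BOARD_SECTIONS = ["Art/Proof", "On-Hold", "Plates", "Akira 1", "Akira 2", "Akira 3",
                  "Akira 4", "Trident", "Minami", "Sanden", "Hunkeler", "Bindery",
                  "Renoma", "Despatch"]
board: dict[str, list[JobTag]] = {section: [] for section in BOARD_SECTIONS}

def move_tag(tag: JobTag, from_section: str, to_section: str, position: int) -> None:
    """Mirror Neil's physical action of moving a tag between columns."""
    board[from_section].remove(tag)
    board[to_section].insert(position, tag)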

Deriving these attributes burdens Neil with complementary calculations, which he performs on paper, by calculator or solely in his head. To determine the manufacturing time (set-up time plus processing time) Neil uses three fields on the tag (quantity, parts, depth) and two look-up tables — set-up times (Table 4) and machine speeds (Table 8) — that he has committed to memory. He uses the following formulae to calculate the number of impressions, Ni, and the manufacturing time, Tmanu, in minutes:

Ni = (Nc Np Nf Dcyl) / (Df Ns)

Tmanu = (60 Ni Dcyl) / (1000 v) + ts

where:
Nc is the number of passes to transfer all the colours
Np is the number of parts in a form
Nf is the number of forms
Ns is the number of forms side-by-side
Df is the depth of the form in millimetres
Dcyl is the cylinder size, that is, the depth of the cylinder in millimetres
ts is the set-up time for the press in minutes
v is the average machine speed of the press in metres per hour

Table 8. Speed of Akira press.

Impressions, Ni      Machine speed v (m/h)72
1000 - 2000          3000
2000 - 5000          4000
5000 - 10,000        6000
10,000 - 20,000      6500
20,000 - 30,000      7000
30,000 - 40,000      8000
40,000 +             9000

The machine speeds in Table 8 are standard values based on normal operating conditions. The maximum permissible speed can vary. It depends upon the drying time of the ink, which varies with the density of the colour, its chemical constituents, the ambient temperature and the relative humidity. When operating conditions vary significantly from the norm, which is infrequent, Neil uses his experience to adjust the estimated manufacturing time.
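Written as code, the estimate is a direct transcription of these formulae and of Table 8. The sketch below (Python) is illustrative; the function names and the worked example are assumptions, and the Table 8 boundaries are applied as simple upper bounds.

def machine_speed(impressions: float) -> float:
    """Standard Akira speed in metres per hour for a given number of impressions (Table 8)."""
    table = [(2000, 3000), (5000, 4000), (10_000, 6000), (20_000, 6500),
             (30_000, 7000), (40_000, 8000)]
    for upper_bound, speed in table:
        if impressions <= upper_bound:
            return speed
    return 9000                                    # 40,000+ impressions

def impressions(n_colour_passes, n_parts, n_forms, d_cyl_mm, d_form_mm, n_side_by_side):
    # Ni = Nc * Np * Nf * Dcyl / (Df * Ns), as in the formula above.
    return n_colour_passes * n_parts * n_forms * d_cyl_mm / (d_form_mm * n_side_by_side)

def manufacturing_time(n_i, d_cyl_mm, setup_minutes):
    # Tmanu = 60 * Ni * Dcyl / (1000 * v) + ts, with v looked up from Table 8.
    v = machine_speed(n_i)
    return 60.0 * n_i * d_cyl_mm / (1000.0 * v) + setup_minutes

# Hypothetical example: 6,000 two-part single-colour forms, 279 mm deep, on a
# 279 mm cylinder, printed one-wide, with an assumed hour of set-up.
n_i = impressions(1, 2, 6000, 279, 279, 1)         # 12,000 impressions
print(round(manufacturing_time(n_i, 279, 60)))     # roughly 91 minutes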

72 Reel changes are included in running time.


The production record of jobs processed during the intensive field study can be used to appraise the efficacy of the formula. A shortcoming with the database was the lack of information on the set-up conditions for each job, apart from changes to cylinder size. Hence, the following conditions were assumed for each job: a single plate change per colour, the setting up of the folder and the loading of a single reel of paper.73 Figure 43 shows the estimates plotted against the actual times listed in the production record. A line of best fit obtained from regression analysis roughly coincides with the formula's prediction. As expected, the correlation between predictive estimates and actual manufacturing times is not one-to-one. In addition, the residuals are skewed for smaller jobs, which suggests inaccuracy in estimating set-up time. The correlation coefficient and the standard error were 0.90 and 85 minutes, respectively. For a production run of 190 minutes, which is the mean estimate, the standard error gives a variation from the predicted manufacturing time of 45%. Nevertheless, within the bounds of the assumptions and the uncertainty in production performance, the formula performs well as an estimator to guide scheduling decisions.
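The appraisal itself is routine once paired estimates and recorded times are available; a minimal sketch is given below. The production data are not reproduced here, and the particular error measure used, the root-mean-square of the residuals about the estimate, is an assumption.

import math

def appraise_estimator(estimated: list[float], actual: list[float]) -> tuple[float, float]:
    """Pearson correlation and a residual-based error measure for paired times (minutes)."""
    n = len(estimated)
    mean_e, mean_a = sum(estimated) / n, sum(actual) / n
    cov = sum((e - mean_e) * (a - mean_a) for e, a in zip(estimated, actual))
    var_e = sum((e - mean_e) ** 2 for e in estimated)
    var_a = sum((a - mean_a) ** 2 for a in actual)
    r = cov / math.sqrt(var_e * var_a)
    # Root-mean-square of the residuals about the estimate (an assumed error measure).
    rms_error = math.sqrt(sum((a - e) ** 2 for e, a in zip(estimated, actual)) / n)
    return r, rms_error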

73 For all jobs, a folder change is included when estimating the cost. A single change of the reel of paper is a reasonable assumption where the number of forms is small. For large jobs, the error in set-up time due to the assumption being incorrect is small compared to the processing time. By ignoring changes to colour applicators, as they do not occur every time, the formula tends to underestimate the manufacturing time. By ignoring plate wear and assuming a single plate change, the time required to set up plates may also be underestimated.


Figure 43. Estimated versus actual manufacturing times. (A scatter plot of actual against estimated manufacturing times, both in minutes and ranging to about 1,200, with the linear fit and the line actual = estimate overlaid.)

In practice, Neil does not always use the formula as written. Often from his experience with similar jobs he knows how long a job will take without calculation. This fits the mental look-up table representation of decision making in supervisory control discussed in Chapter 2. Frequently in applying mental arithmetic, he resorts to approximations to simplify the calculation; as he adds up the set-up times, he rounds off each time to the


nearest ten minutes. Nevertheless, when timing is critical, he uses a calculator to obtain greater accuracy.

5.3.2 Use of the Board for Scheduling

In constructing a schedule, Neil focuses his attention on the plates and Akira-press sections of the board. When a job-bag arrives with artwork, he places the associated job-tag in the plates section, enters the date in the plates field, and then sends the job-bag to plate manufacture. If the tag exists, Neil merely moves it from the artwork to the plates section of the board; otherwise he has to write up a new tag for the job. Because most jobs return within a day, Neil carefully considers both jobs that are currently available and those that he has sent for plate manufacture when he constructs or amends a schedule (see Table 9).

Table 9. The flowtime for producing artwork and plates for jobs in the database.

                 Plates    Artwork
Sample size      217       78
Days, minimum    0         0
Days, maximum    16        62
Days, average    2         11
Days, median     1         7
Days, 90%        4         24

The plates section is subdivided based on number of colours. Neil places the tags for jobs requiring more than two colours under Akira 2&3 and the others under Akira 1&4. Within these subgroups, he often groups the job tags by cylinder size (i.e., by tag colour), tentatively in the order that they will run on the press. Sometimes instead, he groups them by customer, if there are quite a few jobs for the same customer. Such groupings act as reminders of particular orderings when he comes to allocate them to the presses. Sometimes, he may place a job tag under a specific press, even if plates are not available. It reminds him to leave sufficient time within a shift to produce a critical job. The placement of the tag within the list of waiting jobs depends upon the anticipated arrival time of its plates and estimated completion times for the preceding jobs. He takes only a cursory interest in jobs that he has sent out for artwork, as their effect on schedule construction is peripheral. His interest in these jobs only extends to decisive


activities, such as the purchase of materials74 and the setting of broad constraints on machine allocation — discussed later in the chapter — that may affect forward planning. It would be futile to include them in a detailed schedule, as the spread of completion times — indicated by the large variation in median, average and the ninetieth percentile — is too broad for accurate prediction. Besides, plate manufacture normally provides an adequate lead-time for schedule construction. When a job's plates arrive, he crosses out the date in the plates field of the tag. He may either leave the tag under the plates section or move it to a press; whatever he does requires a conscious decision. As he marks the tags, he tends to scan the status of the board. From his appraisal of the schedule, he may decide to move some other tags to better satisfy some perceived objectives. Often Neil moves tags as a group. For example, he may leave a set of tags for one customer under the plates section until all plates have arrived. He then moves the tags as a group to a press.75 When the printing of a job is complete or nearly complete, he moves the tag to the section of the board for the next operation.

5.4 The Goals Neil Seeks and the Rules he Applies

To deduce Neil’s scheduling behaviour, the transcript for the initial investigation was analysed to find answers to the following questions. What goals does he seek? How does he act to try to meet his goals? The aim of the analysis is to show that practical scheduling is far from the simple construct shown in OR models.

74 If Neil can order paper for long runs a month in advance, he can buy off the mill instead of a supplier, thereby obtaining a cheaper price.

75 Neil tends to follow this procedure on jobs in which the customer receives a discount rate.


The investigators had only restricted access to Neil while he was carrying out his normal work activities. They had to make sure that they did not unduly obstruct either Neil's or anyone else's work. Notwithstanding, the data are sufficiently rich to show the complex form of the structural relationship between goals and actions. This limited objective does not require a thorough and exhaustive process that elicits all, or nearly all, the knowledge that Neil applies.76

Neil's goals, and the rules for their attainment, were extracted from Neil's utterances. In many cases, they are the operational objectives of higher-level unstated goals. What is meant by an "operational objective" can be illustrated with the Rankine cycle display, discussed in Chapter 4. For example, the goal an operator may seek, to keep the pressure in the steam generator within certain bounds, is an operational objective. By meeting it, an underlying goal based on thermodynamic theory is also met. Operational objectives may be practices founded on goals and rules that may never have been verbalised. They are part of the scheduler's tacit knowledge, which has formed during years of work in printing. Operational objectives are at the level of surface expertise.77

Neil's use of the concept of tardiness illustrates the difference between surface and deep knowledge. When referring to the degree of tardiness, sometimes he clearly meant the number of tardy jobs; however, at other times the investigators understood that Neil was speaking about average, median or maximum tardiness. It is surmised that Neil's flexible use of the word tardiness did not indicate a lack of clarity in thought or purpose, but was the surface manifestation of a deeper understanding of complex factors at play. The variation in the use of the term tardiness

76 Even under the application of a more intensive method of knowledge elicitation, completeness cannot be verified, unless all scenarios have been explored. A similar approach was taken by Hsu et al. (1993). In eliciting the domain knowledge for their MacMerl mixed-initiative scheduler, the "objectives were not stated explicitly, but were implicit in the methods used by the expert scheduler and were evidenced when the expert explained the method used to construct schedules."

77 The relationship between surface expertise and tacit knowledge is discussed in Chapter 3.


could be explained if the following goal underpinned his behaviour.78

    The number of tardy jobs is minimised, while satisfying two conditions based on tardiness: no job exceeds an acceptable value of maximum tardiness (which varies between customers); the average and median tardiness are within acceptable bounds.

These conditions act as boundary constraints. From this perspective, the form of tardiness that the scheduler considers depends upon context. Where all jobs are due in a day or two and they can all be processed within an acceptable bound on tardiness, the scheduler would probably focus on minimising the number of tardy jobs. Where there is a broad spread in due dates and processing times, the maximum tardiness may vary greatly between schedule proposals although the number of tardy jobs may not. Under these circumstances, the scheduler may focus on reducing maximum tardiness. If the scheduler wants most jobs within an acceptable bound on tardiness (e.g., most jobs are to be no more than one day late), then he may focus on the median tardiness when checking a schedule's suitability.

Neil's approach to scheduling is not that of waiting for a machine to become available and then dispatching the job with highest priority. Instead, he operates on a scheduling horizon of two to three shifts. For each machine, he constructs a queue of available, or imminently available, jobs. In planning a schedule, which goals does he try to meet? The following statement supports full utilisation of the presses being a primary consideration.

    [Akiras] is where the money is made. … What I need to do is to put on volume of work and turn it out every shift.

When allocating jobs to the presses, he also considers what collators they require, as he also aims to fully utilise the collators.

    I've got to think about keeping the collators going all the time. Not just the printing machines. I've got a choice, where possible I try to have two machines at all times in multi-part sets, because I know that 50% of the jobs that come out of the Trident room are multi-part jobs. They are little jobs so they tend to go on the little collator. The first thing I do is to look at the job. If there is any special

78 Whether Neil follows this goal or not is unimportant, as its purpose is to illustrate that a plausible explanation may exist where there are discrepancies in surface knowledge.


    collating, special things that have to be done they automatically go to the Minami and everything else where possible goes on the Sanden for straight collating. If I've got three machines in multi-part set … I'll build up a backlog of work. I don't want a backlog … I haven't got the room and I want to invoice as much work as possible. So I can keep one machine in single parts, knowing that it's going straight out the door all the time. The other two machines I keep in multi-part sets.

During this allocation, Neil tries to establish a sequence in which the same size cylinders are used and:

    Then I try to group jobs together where my paper width is the same, so I don't have to move the sprocket holes. I just put a reel on and change the plate. And thirdly, if you got the paper set up for the same width and you have the cylinders set up for same width all you have to worry about is washing out the colours.

There are more than two goals in the above statements and they are intricately linked. They can be arranged into goals and subgoals. The primary goal is full utilisation of all machines. To endeavour to meet this goal, Neil separately seeks to minimise idle time on both the presses and collators. To minimise non-productive time on a press he seeks another goal, minimisation of set-up time. In minimising non-productive time on the collators, his primary concern is the minimisation of waiting time. If the stream of multiple-part jobs dries up, the collators will have no work.79 He also does not want a large backlog of work awaiting the collators: he wants to minimise WIP. Minimisation of WIP is a subgoal of

79 Floor space places an upper bound on the amount of WIP.


“maximise jobs invoiced in the month.”80 Again this is a subgoal of another goal, maximise cash flow.81 Neil's statements above also address scheduling policies, or heuristics. For example, to minimise both WIP and the idle time on the collators, two presses produce multiple-part jobs while the third processes single-part jobs.82

Unlike OR methods, Neil considers operator performance. The following statements show the importance he places on keeping all operators working.

    Whether they get $400 a week or $4000 a week you still have to have them working. If you've got troughs, it is hard to keep even.

    I'm having a fluctuating problem of work flow at the moment — attitudes more than anything. Guys are critical. When work quietens down they all stand around and chew the fat or their machines slow down and when work comes in, and heaps of it, machines don't go automatically back to what they should be run at, because you get into a slow routine and its hard to snap out of it.

    The ideal situation is that you have everyone working at a reasonable level. There is no point in having one guy busting his guts and other standing around twiddling his thumbs. Regardless of how much people earn, its good for morale, attitudes and everything else to see he has some work and I've some work [referring to two persons comparing themselves].

80 He stated above that he did not want a backlog as "(I) want to invoice as much work as possible". Elsewhere he states that it was critical to invoice all jobs by the end of the month.

81 This goal was not explicitly stated but was elicited from his discussion on invoicing at the end of the month.

82 Neil made this statement before the change to the configuration of the presses. Previously, Akira 4 could only convert continuous paper into sheets and therefore was not available for printing.


As operators supervise and control machines, their performance is linked to machine performance. In the previous set of statements, the primary goal was 'maximise utilisation.' The goal here is to maximise the processing speed. The print quality sets the upper bound on processing speed. The critical significance of this goal in overall decision making is shown in Neil's statement:

    If a decision has to be made when producing and delivering a multi-part set and keeping everybody continually working, than produce a single-part set and have blokes standing around scratching their nuts.

The primary goal is 'maximise productivity.' A subgoal is to keep the operators working.

    You've got to look at a balance. It is no good having 20 pallets of work sitting down there. I've got a fair bit of multi-part working sitting on the floor at the moment to be collated up. Basically a lot of it is for the Minami, so therefore I now look at the jobs and say I've got the Minami covered.

These utterances give some indication that Neil's goals are far more elaborate than sparse OR representations. At first sight, the number of goals he seeks tends to overwhelm an observer. There is, however, a structural relationship between them. For the above cases, Neil focuses on goals that he can meet by instigating operational steps under his control. As he manipulates the schedule to move it closer to meeting these goals, he produces a schedule that also becomes closer to satisfying more goals that are primary. For example, Neil may manipulate the schedule to bring it closer to satisfying the subgoals: minimise press idle time, minimise press set-up time, minimise collator idle time and minimise collator set-up time. This activity moves the schedule closer to the primary goal, maximise utilisation. Likewise, the goal, 'maximise utilisation,' is a subgoal of a more abstract goal, 'maximise productivity.'

The structural relationship between goals is shown in Figure 44 (the labels are defined in Table 10). Directed arcs into a goal indicate that it will tend towards being satisfied if there is a tendency towards satisfying the goals connected by the arcs. Goals A were easily extracted from Neil's utterances, as they are the operational objectives that he directly applies. Goals B are at a higher level of abstraction than the operational objectives. They are also subjects of Neil's attention; however, the extent of the focus varies between goals. A comparison of two goals shows this. Satisfying full utilisation of all machines is directly

met by the underlying goals. To move the schedule closer to the goal, maximise processing speed, Neil seeks the minimisation of operator idle time. However, the link between these goals is more indirect than in the previous case. Neil perceives that operator performance degrades when there is insufficient work for them to have to exert themselves to meet production demands, as the following extract from the previously quoted text shows:

    I'm having a fluctuating problem of work flow at the moment — attitudes more than anything. Guys are critical. When work quietens down they all stand around and chew the fat or their machines slow down and when work comes in, and heaps of it, machines don't go automatically back to what they should be run at, because you get into a slow routine and its hard to snap out of it.

Two factors affect the processing speed on a press: the number of impressions and machine speed. The number of impressions can be decreased if forms can be printed side-by-side.83 The quality level that the customer expects (goal 9B) sets the upper bound on speed. This constraint is shown as a directed arc formed by a dotted line and an open arrow. Goals C and D are at even higher levels of abstraction. Although Neil made no direct reference to these goals, they can be inferred from the other goals. At the highest level of abstraction is the raison d'être of the company. It is expressed in Figure 44 as the maximisation of long-term financial return, which can be plausibly inferred from the underlying goals.84

83 Running jobs side-by-side decreases the number of impressions by half and therefore processing time is halved.

84 At no time during the field study did Neil or the company directors state the primary goal of scheduling and its relation to the company's ultimate goal or mission. Although the actual primary goal may be different (e.g., "maximise long-term viability") its import would be similar and hence the underlying scheduling subgoals will be the same.


Figure 44. The scheduling goal structure. (The figure arranges the goals defined in Table 10 into a directed graph: the operational goals 1A-15A at the lowest level, goals 1B-9B and 1C-3C at progressively higher levels of abstraction, and goals 1D and 2D leading to the top-level goal, "Maximise long-term financial return".)

Table 10. Scheduling goals.

GOALS A
1A  Low press idle time
2A  Low press set-up time
3A  Low collator idle time
4A  Low collator set-up time
5A  Low cutter idle time
6A  Low operator idle time
7A  Low WIP
8A  Invoice all jobs in the same month that paper is purchased
9A  Little change to press set-up between discounted jobs
10A Complete all jobs for a customer concurrently
11A Configure presses for premium jobs
12A Low tardiness
13A Particular jobs meet their due date
14A Match operator ability to work task
15A Give priority to favoured customer

GOALS B
1B  Fully utilise all machines
2B  Maximise the processing speed of each machine
3B  Fully utilise all operators
4B  Maximise jobs invoiced in the month
5B  Maximise return on discounted jobs
6B  Minimise delivery costs by minimising delivery trips
7B  No waiting time for premium jobs
8B  All jobs delivered on their due date
9B  The quality of all jobs is to the standard the customer expects

GOALS C
1C  Maximise productivity
2C  Maximise cash flow
3C  Maximise satisfied customers

GOALS D
1D  Maximise short-term financial viability
2D  Maximise repeat custom
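Part of this structure can be written down directly as a directed graph. The fragment below records only the arcs that are stated explicitly in the surrounding text; it is an illustrative partial encoding, not a reproduction of Figure 44, and the dictionary representation is an assumption.

# Partial encoding of the goal structure: each goal maps to the higher-level
# goal(s) whose satisfaction it supports. Only arcs stated explicitly in the
# surrounding text are included; Figure 44 contains more.
SUPPORTS = {
    "1A low press idle time":       ["1B fully utilise all machines"],
    "2A low press set-up time":     ["1B fully utilise all machines"],
    "3A low collator idle time":    ["1B fully utilise all machines"],
    "4A low collator set-up time":  ["1B fully utilise all machines"],
    "6A low operator idle time":    ["2B maximise the processing speed of each machine"],
    "7A low WIP":                   ["4B maximise jobs invoiced in the month"],
    "4B maximise jobs invoiced in the month": ["2C maximise cash flow"],
    "1B fully utilise all machines": ["1C maximise productivity"],
}

def supported_goals(goal: str) -> set[str]:
    """All higher-level goals that a given goal tends to support, following the arcs."""
    seen: set[str] = set()
    stack = list(SUPPORTS.get(goal, []))
    while stack:
        g = stack.pop()
        if g not in seen:
            seen.add(g)
            stack.extend(SUPPORTS.get(g, []))
    return seen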

Before a job can be processed, Neil has to organise the manufacture of its plates and the purchase of the required paper. He also has to ensure that there are sufficient operators available to operate the machines that he intends to schedule. These are goal-directed activities, in which the goals are the requisite state of the system for printing to proceed.

Before Neil considers other goals, he has to be sure that these goals (pregoals) have been met.85 Their realisation may take a significant time; for example, the time for the acquisition of plates is two days on average (see Table 9). Neil allocates a job to a machine only if the following conditions are met:
1. The job's plates are expected to be available by the planned time for its first operation;
2. The paper required for the job is expected to be available by the planned time for its first operation;
3. Operators are expected to be available to supervise the machines for the times that the job has been allocated to them.

Machines known to be out of service, due to breakdown, are not scheduled. Neil may anticipate that machines that are not available at the beginning of the scheduling period may become available within the planning horizon. Availability is therefore a constraint on planning and not a goal of planning. If an unavailable machine does not become available at the anticipated time, then processing will not go to plan. The tags for jobs that do not meet these conditions remain under the plates section of the board; nevertheless, Neil does not ignore them. He considers their relation to other jobs, either allocated to presses or awaiting allocation.

In planning the sequence of jobs at each machine, Neil seeks to meet goals by following a scheduling criterion. If he wants to minimise the idle time of a press, for instance, the action plan he follows is "ensure that a job is available at the press when the current job is complete." In carrying out this plan, the scheduling policy, or heuristic, that he implements is "the placement of a job in the queue that meets the technical constraints of the press." The realisation of this policy requires a number of operational steps. Scheduling activities that bring the schedule closer to the goals A shown in Figure 44 and Table 10 are tabulated in Table 11, which shows for each goal:
1. The scheduler's immediate objective that brings the schedule closer to the goal;

85 Note that these pregoals are in the form of expectations. If the expected result does not occur by the scheduled time then the schedule has to be repaired.


2. The policy in selecting a job, or string of jobs, to place in the queue for a specific machine that is directed towards the objective;
3. The operational steps to realise the policy.

In the allocation of a job to a press, the operational steps described in Table 11 only pertain to jobs that meet the technical constraints of the specified machine. For example, in selecting a job for the single-colour press, Akira 4, only single-colour jobs are considered. The operational steps presented in the table are steps that Neil may conceivably make if he sought each goal separately.86 Delineating the operational steps for each goal from Neil's behaviour is problematic, as he wants to satisfy various goals concurrently. Hence, his scheduling policy will be one that satisfices goals that may be competing. For example, to minimise collator idle time (3A) and WIP (7A), his action plan is "balance the mix of single- and multiple-part jobs across the presses such that the collators and cutters are never idle and the waiting time for the secondary operation is minimal." The scheduling policy he follows is "produce multiple-part jobs on two presses while the third processes single-part jobs" (see p.165). This policy is much easier to follow than the individual policies. If, in following the policy, too much WIP accumulates or machines become idle, the balance can easily be modified.

86 They have been deduced from observation and discussion. While it is argued that Neil's actions may at a cursory level fit the operational steps shown, the extraction process lacked rigour. The aim was to show that the multiplicity of factors that Neil considers makes scheduling a "perplex" problem.

Table 11. Realisation of goals.

Goal 1A. Low press idle time
Objective: Job is available at the press when the current job is complete.
Scheduling Policy: Place a job in the queue.
Operational Steps:
1. Scan job tags under the Plates section to find jobs that can be processed on the press.
2. Select one of these jobs; preferably, one that will help to satisfy some other goal(s).

Goal 2A. Low press set-up time
Objective: Maximum time between changes to the press set-up.
Scheduling Policy: Place job in the queue that requires the same set-up as its predecessor.
Operational Steps: As the operational steps are intricate, they are expressed as a flow chart in Figure 45. A complete chart is not shown: "off-page" connectors 1 and 2 lead to additional operational steps.

Goal 3A. Low collator idle time
Objective: Job available at a collator when the current job is complete.
Scheduling Policy: Place a multiple-part job that requires the collator in a press queue such that printing will finish by the time that the collator becomes available.
Operational Steps:
1. Use other goals to select a job to place in the printing queue.
2. Scan each collator queue and estimate the time when it would become idle.
3. If the job is expected to finish printing after a collator becomes idle, then select another job that requires this collator and will finish printing before the collator becomes idle.

Goal 4A. Low collator set-up time
Objective: Maximum time between changes to the collator.
Scheduling Policy: Place job in the queue that requires no change to the collator set-up.
Operational Steps:
1. Use other goals to select a job to place in the printing queue.
2. If it requires collation, then estimate the time it would become available to place in the collator queue.
3. If it can be positioned in the queue so that there is no change to the collator's configuration, then place the job in the press queue; else go to 1.

Goal 5A. Low cutter idle time
Objective: Job available at the Bowe cutter when the current job is complete.
Scheduling Policy: Place a job that requires the cutter in either a press or a collator queue such that the operation finishes by the time that the cutter becomes available.
Operational Steps:
1. Use other goals to select a job to place in the printing queue.
2. Scan the cutter queue and estimate the time when it would become idle.
3. If the selected job requires the cutter as the second operation and it will be processed by the time the cutter becomes idle, then place it on the queue; else, if the selected job requires the cutter as the third operation, then, if the job will complete collation (which requires estimation of processing time and waiting time at the collator) by the time the cutter becomes idle, the job is placed in the queue; else reject the job and go to 1.87

Goal 7A. Low WIP
Objective: Reduce waiting time at the collators and cutter.
Scheduling Policy: Place a job in the queue that will not have to wait a long time in the queue for its next operation.
Operational Steps:
1. Use other goals to select a job to place in the printing queue.
2. Scan each collator and cutter queue and estimate the time when it would become idle.
3. If the job is expected to finish printing before a collator or cutter becomes idle then, if the waiting time in the collator or cutter queue is considered excessive, use other goals to select another job.

Goal 8A. Invoice all jobs in the same month that paper is purchased
Objective: Job delivered in the same month that its reserved paper was purchased.
Scheduling Policy: Place a job in the queue that has specific paper reserved.
Operational Steps:
1. Use other goals to select a job to place in the printing queue.
2. Scan available jobs to see whether there are any jobs with reserved paper that, if placed after the selected job, would be delivered in the next month.
3. If so, then in place of the selected job, place the most critical job in the queue that meets the delivery requirements.
4. If there are no jobs having reserved paper that can be delivered by the end of the month, then place the originally selected job in the queue.

Goal 9A. Little change to press set-up between discounted jobs
Objective: Group discounted jobs for a customer into one transfer batch.
Scheduling Policy: Form a string of discounted jobs under the Plates section of the machine loading board.
Operational Steps: See the discussion on setting up strings in the body of the text on p.179.

Goal 11A. Configure presses for premium jobs
Objective: Job placed on press without waiting for cylinder-size change.
Scheduling Policy: Place job in the queue that meets the requirement that the three most-popular cylinder sizes remain set up across the presses.
Operational Steps: See the discussion on the configuration of cylinders across the shop in the body of the text on p.178.

87 From the statistical profile of resource usage shown in Figure 41, there were ten times more jobs coming directly from printing to the cutter than those coming from collating.

Figure 45. Operational steps to realise goal 2A, "The minimisation of press set-up time." (The flow chart starts by scanning the jobs waiting at the Plates section of the machine loading board for a set A that meets the technical constraints of the press; if there is none, the press is left idle. It then looks successively for a subset B of A that requires the same cylinder, a subset C of B that requires the same reel, a subset D that requires the same width, and a subset E that requires colours already available or a subset F that requires minimal wash-up. If no job requires the same width, it falls back to subsets requiring a lesser width, and then a greater width, again preferring jobs whose colours are available. If no waiting job requires the current cylinder, the cylinder is changed to suit an available job, preferably to meet goal 11A, or the press is left idle. A job is then selected from the narrowest non-empty subset according to other goals; off-page connectors 1 and 2 lead to additional steps not shown.)
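In essence, the cascade filters the waiting jobs through progressively stricter set-up conditions and picks from the narrowest non-empty subset. The sketch below is a simplified rendering of that idea; the job and press fields are assumptions, and the lesser-width, greater-width and off-page branches of the flow chart are deliberately omitted.

from dataclasses import dataclass

@dataclass
class Job:
    cylinder_mm: int
    reel: str
    width_mm: int
    colours: frozenset

@dataclass
class PressState:
    cylinder_mm: int
    reel: str
    width_mm: int
    loaded_colours: frozenset

def candidates_for_press(waiting: list[Job], press: PressState) -> list[Job]:
    """Simplified Figure 45 cascade: narrow the waiting jobs until a filter empties."""
    subset = [j for j in waiting if j.cylinder_mm == press.cylinder_mm]   # set B: same cylinder
    if not subset:
        return []   # change the cylinder (preferably to meet goal 11A) or leave the press idle
    filters = (
        lambda j: j.reel == press.reel,                 # subset C: same reel
        lambda j: j.width_mm == press.width_mm,         # subset D: same width
        lambda j: j.colours <= press.loaded_colours,    # subset E: colours already available
    )
    for keep in filters:
        refined = [j for j in subset if keep(j)]
        if not refined:
            break            # fall back to the last non-empty subset
        subset = refined
    return subset            # a job is then chosen from this subset according to other goals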

The operational steps to realise goal 2A (Figure 45) encompass the subgoals shown in Table 12. If, on scanning the waiting jobs for a job, or string of jobs, to place on a particular press, Neil finds a set of jobs that require the current cylinder size, then he may select jobs that meet goal 2AS1. Next, he would first look for jobs that require the same reel of paper as the previous job (goal 2AS2). If there were none, he would then consider all jobs that meet goal 2AS1. If amongst these jobs there is a subset that has the same width as the previous job processed by the press, then their selection would meet goal 2AS3. From the jobs that meet this condition, a subset that maximises the time between washing the colour applicators (goal 2AS4) would have preference.

Table 12. Subgoals of goal 2A, "Minimisation of press set-up time."

Subgoal    Objective
2AS1       Long time between changes to cylinders
2AS2       Long time between reel changes
2AS3       Long time before changing the cross perforator
2AS4       Long time before washing the colour applicators

The meeting of goal 7B, "no waiting time for premium jobs," is important, as provision of quick turnaround is the company's strategic niche in the market. For any premium job that arrives, there must be a press already set up with appropriately sized cylinders. Otherwise, the job would have to wait about 40 minutes while the operator changes the cylinders. Neil ensures that he meets this condition by including goal 11A, "configure presses for premium jobs," as part of his normal action plan. To satisfy this goal, he arranges the shop such that a press is immediately available for the three most-popular cylinder sizes (Table 13).

Table 13. The cylinder requirement for jobs in the database.

Cylinder Size     count    percent
11⅔ (297 mm)      176      16%
11 (279 mm)       450      40%
17 (432 mm)       337      30%
14⅔ (372 mm)      92       8%
13 (330 mm)       58       5%
Total             1113

The obvious strategy is to set the six-colour press to the cylinder size that has the greatest demand, as there would be no constraint on the number of colours. Furthermore, the range of colours available in the applicators would more likely meet the requirements than the other

presses.88 This may also decrease the time to wash the applicators. The obligation to process jobs that customers have already ordered, efficiently and on time, tempers the scheduler’s desire to provide quick turnaround for jobs that may never be. Goal 7B only becomes active when a premium job becomes available. The slack time on arrival for premium jobs is usually only a few hours.89 On receiving the order for a premium job, Neil estimates the latest start time for each operation. If necessary, he will pre-empt a running job. If a job is to be pre-empted, then Neil prefers to wait until the operator stops the press during the process. This occurs when there is a change to another part on a multiple-part job, a pass is complete on a multiple-pass job, worn plates have to be replaced or the paper reel is changed. On restarting the press, there is an appreciable lapse of time before attaining the normal operating speed. To meet goal 9A, “Little change to press set-up between discounted jobs,” jobs for a customer that have been costed at a group rate — per thousand instead of per job — are grouped into strings. Neil treats jobs in a string as a single transfer batch. Their processing, which is consecutive, is such that there is minimal change to the press configuration. A basic requirement is that jobs in a string all have the same cylinder size. For discounted jobs, Neil forms strings in the Plates section. He arranges jobs in a string in the order of processing that minimises time lost in minor set-ups. As jobs are sent to plate manufacture, he places their tags in the appropriate string. String building activity precedes allocation to a press.
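One way to picture string building is as grouping a customer's discounted jobs by cylinder size and then ordering each string so that minor set-ups stay small. The sketch below is a hedged illustration only: the descending-width ordering (so that the cross perforator never has to be replaced part-way through a string) is an assumption consistent with the width rule described earlier, not Neil's documented procedure.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Job:
    job_number: int
    customer: str
    cylinder_mm: int
    width_mm: int

def build_strings(discounted_jobs: list[Job]) -> dict[tuple[str, int], list[Job]]:
    """Group discounted jobs into strings, one per customer and cylinder size."""
    strings: dict[tuple[str, int], list[Job]] = defaultdict(list)
    for job in discounted_jobs:
        strings[(job.customer, job.cylinder_mm)].append(job)
    for jobs in strings.values():
        # Assumed ordering: widths only ever decrease within a string, so the
        # cross perforator never needs replacing between consecutive jobs.
        jobs.sort(key=lambda j: j.width_mm, reverse=True)
    return dict(strings)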

88 Of the 106 jobs processed on the Akira presses for which the number of colours was recorded, 42% required 1 colour, 42% required 2 colours, 10% required 3 colours, 4% required 4 colours and 1% required 6 colours.

89 Slack time on arrival shows how much time there is before the processing of a job has to start for it to finish by its due time (slack time on arrival = di − pi − ri = ai − pi, where ri is the arrival time, di is the due time, pi is the processing time and ai = di − ri is the total allowance for time in the shop).


The policies and operational steps outlined in Table 11 describe the allocation of a single job to the next place in the queue. Each goal is considered independently. While Neil often allocates jobs one at a time, he does not do it in a myopic way. He may consider the effect of the allocation on various goals at the same time. The table does not describe how he balances competing goals.

While the relative priorities of goals are not enumerated, Neil has a subjective ordering of some goals. His primary goal is "to produce as many jobs as possible in the least time and as cheaply as possible." The emphasis is on the throughput of jobs and the monetary cost. Throughput and cost pertain to utilisation; during discussions Neil repeatedly declared that the machines must be kept working. To maximise utilisation, he tries to minimise the set-up time and machine idleness (goals 2A and 1A). He perceives these goals to be more important than trying to minimise tardiness (goal 12A). For him, it is acceptable to follow a strategy that meets his primary goal even if the consequence is that most jobs are a day or so late. That is, he is willing to forgo goal 8B, "all jobs delivered by their due date." He judges that the risk in not "maximising customer satisfaction" is sufficiently low that the pre-eminent goal, "maximisation of repeat custom," is not jeopardised. If, however, the customer insists that the job be delivered by its due date (goal 13A), then the strategy is adjusted to try to eliminate the violation of this constraint.

He also looks beyond the immediate object, the allocation of a job to a specific press, to see how an allocation may affect the shop as a whole. Perhaps it would be better to allocate the jobs he is considering to other machines. Alternatively, it may be better to move some jobs currently allocated elsewhere to the machine at the focus of his attention. For example, the operational steps for goal 2A, shown as a flow chart in Figure 45, only consider the immediate object, the allocation of a job to the specific press. In practice, when deciding whether to select a job that requires colours already available on the press, Neil considers the effect of his choice on other presses. In selecting a three-colour job for the six-colour press, Akira 3, time would not be lost to cleaning if inks of the required colours are already loaded in the applicators or there are sufficient unused applicators. However, if Neil leaves this three-colour job for the four-colour press, then the six-colour machine would be available for jobs requiring five or six colours. His choice depends, inter alia, on the current sequences on both machines, the possible sequences on each, the set-up costs on each when the job is loaded and the total set-up cost for each over a selected period. The flow chart


does not show how Neil makes a choice between jobs that meet the conditions of any particular decision. If more than a single job requires the prevailing set-up — meeting the conditions that form a column on the far left of the flow chart — goal 12A, "low tardiness," becomes dominant. The "minimisation of set-up" makes available more time for production, and thereby helps to reduce tardiness for a given set of jobs. In addition, the order in which jobs are processed within a string — requiring the same set-up — also affects tardiness. While from an OR perspective SPT would yield the most jobs over a given time, Neil stated explicitly that "the priority is what takes the longest time." This implies the use of LPT, a common practice of schedulers. As large jobs are sold at a higher price than small ones, Neil places a high priority on meeting their due dates. His focus is on establishing good relationships with customers whom he anticipates will place large orders in future. That is, for goals 3C, maximisation of satisfied customers, and 2D, maximisation of repeat custom, the weighting he applies for these customers favours them.

There are other dimensions not shown in the scheduling goal structure in Figure 44. To meet the pregoal, "operators are available to supervise machines," Neil aims to minimise operator absenteeism. Neil therefore considers the operators' reactions when setting work, as discontent may lead to absenteeism.

    You have … blokes like Llewy (who is) not a very reliable printer as far as quality goes. He is good at the collators. If you give him a day on the collators, he won't come in the next day.

As well as minimising set-up time, Neil endeavours to meet due dates by striving to reduce tardiness in one of its forms. In choosing a schedule, he deliberates upon other aspects of the job and the environment. His goal is composite. It includes, but is not restricted to, the maximisation of machine utilisation and the minimisation of tardiness. Of course, these could be expressed as a weighted linear function; however, this is not his practice. In minimising set-up times, he critically appraises the practical merit of each notional reduction in unproductive time. The processing time for most jobs is between 20 minutes and four hours. For small jobs, all these factors contribute significantly to lost time. However, for large jobs, time lost in washing applicators and changing perforating tools is only minor in proportion to the total time for an operation.


Consequently, to meet other objectives, Neil may be willing to tolerate avoidable changes to the set-up of the machine. The relative importance of each goal may vary with the time of the year, month, or week, the shift, the value of the current jobs, customer goodwill, and the practice of competitors. Neil also has to consider other, less tangible, goals, for example, satisfying customers. Satisfaction is difficult to express. It does not have a singular meaning. For some customers the meeting of the date agreed upon is most important. For others, the primary interest is for the turnaround to be fast, while other customers are concerned about the quality of the job. At Melamed, the highest quality is only achievable on the six-colour machine running at less than normal speed. These restrictions contravene the conditions for maximising machine utilisation and reducing average turnaround time. The interpretation of some factors depends upon context. For some customers the due date is rigid. For others, completing the jobs a day or two late may be immaterial: whether Neil renegotiates the due date depends upon the relationship with the customer. Yet again, these very same customers may have jobs in the system with due dates that are atypically firm. The attitude towards the customer also influences the significance placed on a due date. The disposition of the scheduler, manager, or sales representative towards a customer who is regular, new, slow to pay, belligerent when jobs are late, etc., may affect which delivery date Melamed considers acceptable. Different departments or persons may see particular constraints quite differently. A sales representative, a production supervisor and a customer may hold quite different views on the firmness of a due date. While a customer may not be unduly concerned about a late delivery, a sales representative may see a late delivery as a threat to his or her reputation. The value placed on any particular factor is an outcome of the interplay between interested persons and groups.90

90

Each of these situations either arose during the observed periods or were described by Neil.


There may be contextual factors relating to the working environment. For example, in scanning available jobs, the customer’s name sometimes signifies that the customer expects exceptional quality. To achieve such quality, the scheduler may need to allocate the job to a particular machine with an especially good operator. As operators find it stressful to produce work of very high quality, Neil tries not to overload an operator with exacting work. He therefore tries to mix lower quality with high quality work. While AI approaches can, to some extent, deal with context, they suffer from an inability to decipher qualitative differences in meaning without having them explicitly stated (Papantonopoulos, 1990). This requires teasing out all possible contexts and associated meanings for circumstances that may not yet have arisen, and then formulating appropriate rules to place in a knowledge base. Customers may tolerate less urgent jobs being tardy if it helps to establish a relationship that allows them to be exceptionally favoured in having urgent jobs turned around very quickly. Where there are competing goals, a relationship between them may be expressed as a weighted linear function; however, this is not Neil’s practice. There are other goals that are subsumed by the goals shown in Figure 44. A primary concern is to maximise the productivity of the operators. As the operators tend machines, performance measures based on the goals associated with the machines are indirect measures of operator productivity. For example, the time taken to change the configuration of a machine (e.g. 2A) depends upon the operator. The minimisation of press idle time, goal 1A, in part depends upon the operator. Neil finds that an operator can become irritable if Neil asks him to change the cylinders on a press when they have only recently been changed, especially while there are other jobs available requiring the current configuration. While the schedule is the plan of work, the flow of work on the shopfloor is affected by everyday vagaries. Therefore, finding more work when a machine becomes idle depends upon the operator’s initiative: You’ve got someone super-efficient and committed like Rudi that will look for work. You’ve got other operators that if there is no work … or if Rudi is not there or if I don’t go down there they won’t find more work.


To try to maintain operator productivity, Neil tries to keep all operators occupied, thereby hoping to develop appropriate work attitudes and to maintain morale. His perception is that morale reduces if some operators are working hard while others are not working. Therefore, if there is a shortage of work to keep all the operators occupied, then Neil gives higher priority to jobs that require finishing operations that will absorb operator idle time. He also attends to operator idiosyncrasies. For example, in operating collators Llewy performs well; however, he is below standard as a press operator.91 Nevertheless, if Neil places him on the collators for a whole shift then he will not come to work the next day. To minimise processing time (goal 2B) Neil tries to decrease the number of impressions in a job by printing forms side-by-side. As a rule of thumb, Neil places two forms across the width of a sheet (i.e., run side by side), when it is technically feasible, if the number of forms is not less than 45000.92 In Figure 41, a single job seems to have been processed first on the Trident and then on the Bowe cutter. It used a non-standard means for achieving a production goal. The job consisted of two million forms made up of separate sheets. Its path was not exactly as shown. Production started on a Trident press, but was taking too long and would not meet the due date. It was transferred to an Akira press, which was faster than the Trident, and was produced as continuous form. In effect, the job was split into two transfer batches. Although it required an extra operation, cutting into sheets on the Bowe, the flowtime was reduced.

91

Whether the poor performance is associated with quality or quantity was not elicited. 92

The saving on manufacturing time is not large enough below 45000 impressions to warrant the extra work required for reconfiguring the machine. Side-by-side takes half an hour for set-up and another half an hour to get it running at its optimum. Assuming a depth of form of 279mm, the calculated manufacturing time would reduce from 85 minutes to 65 minutes if forms were run side-by-side. For 15000 impressions, the time for both configurations is 40 minutes.
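As a rough illustration, the rule of thumb and the threshold quoted above can be written as a single test. The following is a minimal sketch, not part of the thesis; the function and its arguments are illustrative, and the only figures used are the 45000-form threshold and the timing comparison in the footnote.

# A minimal sketch of the side-by-side rule of thumb described above.
# The 45000-form threshold is from the text; the function is illustrative.

SIDE_BY_SIDE_THRESHOLD = 45_000   # forms


def print_side_by_side(num_forms: int, technically_feasible: bool) -> bool:
    """True if the job should be run two forms across the width of the sheet."""
    return technically_feasible and num_forms >= SIDE_BY_SIDE_THRESHOLD


print(print_side_by_side(45_000, True))   # True: at or above the 45000-form threshold
print(print_side_by_side(15_000, True))   # False: below the threshold (footnote 92: both configurations take about 40 minutes)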


5.5 Intensive Study of Scheduling

Neil’s scheduling behaviour was observed on three weekdays from the 28th May to the 1st June 1992. This covered normal production days on Thursday, Friday and Monday and limited production over the weekend. Each step he made in constructing schedules on the machine loading board for the web presses was recorded. As rational action underpins scheduling activity (see Chapter 2), each step of the scheduling process was assumed to be purposeful. Often it was clear from the activity which goals were relevant. In other cases, Neil was asked the purpose of his actions. For the four presses, the key goals associated with each step are shown in the diagrams in Figure 47 to Figure 50. These goals were central to the decision making. Due to organisational restrictions in interrupting Neil’s activities for debriefing, the observers were unable to elicit those goals whose effect on the consequent action was minor. In particular, goals associated with subsequent operations could not be discerned. In the diagrams, the goals at each step are spatially encoded to an extended goal structure, which includes the subgoals of goal 2A as shown in Figure 46. The grid of rectangles placed over the goal structure maps goals to locations in the graphical display for each step. Dots signify goals that Neil attended while stars signify goals that he manifestly violated. For example, consider scheduling Akira 1 shown in Figure 47. The actions at step 4 support goals 20A and 7B while violating goal 2AS3. Where a tag is transferred back to the plates section or to another machine, an arrow is shown directed towards the right with an appropriate label attached. Similarly, arrows directed towards the left signify tags transferred from other presses. The lack of an arrow indicates that the tag came from the plates section.
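To make the encoding concrete, the sketch below shows one way the recorded steps could be represented and summarised per press. The record layout, field names and the sample step are illustrative assumptions, not the study’s actual tooling, although percentages of this kind are what Tables 14 and 15 report.

# A minimal sketch (not the study's tooling) of how each recorded scheduling
# step could be encoded and then summarised per press.
from dataclasses import dataclass, field
from collections import defaultdict


@dataclass
class Step:
    press: str                                     # e.g. "Akira 2"
    number: int                                    # step number on that press
    attended: set = field(default_factory=set)     # goals marked with a dot
    violated: set = field(default_factory=set)     # goals marked with a star
    tag_from: str = "plates"                       # "plates" or the press the tag came from


def percent_of_steps(steps, which="attended"):
    """Percentage of steps on each press at which each goal was attended (or violated)."""
    per_press = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for s in steps:
        totals[s.press] += 1
        for g in getattr(s, which):
            per_press[s.press][g] += 1
    return {p: {g: round(100 * n / totals[p]) for g, n in goals.items()}
            for p, goals in per_press.items()}


# Illustrative record only: a step attending goal 7B while violating goal 2AS3.
steps = [Step("Akira 1", 4, attended={"7B"}, violated={"2AS3"})]
print(percent_of_steps(steps))
print(percent_of_steps(steps, "violated"))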


Figure 46. The goal structure including the subgoals of 2A. [Figure: the goal hierarchy headed by “Maximise long-term financial return”, containing goals 1D–2D, 1C–3C, 1B–9B, 1A–15A and the subgoals 2AS1–2AS4 of goal 2A.]

An action may simultaneously support and violate the same goal. For example, at step 15 in the scheduling of Akira 2 (Figure 48) Neil placed two tags in the queue following a tag for a smaller-width job. Therefore, goal 2AS3 “long time before changing the cross perforator” is violated. However, within the two-tag string the widths decrease. Therefore, the order within the string supports goal 2AS3. Most goals recorded were at the bottom of the goal-structure hierarchy as these relate most directly to the immediate action. By meeting the lower-level goals, the higher-level goals come closer to being satisfied. For example, if all goals from 2AS1 to 2AS4 are met then no time is wasted in setting up the press. Accordingly, goal 2A “low press set-up time” is also met. This in turn contributes to goal 1B, “Fully utilise all machines.” In spite of the link to higher-level goals, in most cases they are not shown on the record, as it cannot be confirmed that Neil directly attended them. For a few cases, attention to both low-level goals and high-level goals was unequivocal. A good example is step 13 in the scheduling of Akira 2. Neil introduced a gap in the list of tags to remind him to place tags for two plain-paper jobs (i.e., conversions) when they become available. As the machine would be set already to a width of 241mm, the application of ink could be stopped and then the jobs run using the same paper as the previous two jobs. This eliminates a change of reel (goal
2AS2). This depends upon the cylinder size not changing and hence goal 2AS1 being satisfied. As the width does not change, goal 2AS3 is satisfied. Neil also directly referred to the higher goal: “So a job that may be costed out as an hour may take a quarter of an hour. Thus, money can really be made if a job can be tacked onto another.” In contemplating the “tacking on” of jobs, Neil focuses on reducing set-up as a whole. The reference to “making money” is a focus on a financial goal. Within the goal structure, the relevant goal is 2C, “maximise cash flow.” Moreover, he stated that he “keeps jobs on purpose a day or two late, so that he can group them together.” That is, to satisfy these goals he is ready to violate goal 13A, “meet a job’s due date.” Furthermore, he declared, “While you have to give the customer good service, but they also want a low price, so the aim is to keep the costs to a minimum.” Hence, he also deliberates on the balance between goal 2C and goal 3C, “maximise satisfied customers.” However, goal violations do not pass through the hierarchy, as violation of low-level goals does not imply violation of higher goals. For example, a change to the cylinder size causes a violation of 2AS1 for the immediate step. However, the change may begin a string of jobs for the same cylinder size. Thereupon goal 2A is satisfied. At step six on Akira 2, the violation of 2AS1 initiates a series of scheduling steps that rely upon the new cylinder size to meet the same goal.


Figure 47. The relevant goals for each step in the scheduling process for Akira 1. [Figure: the goal structure repeated for each of the 12 steps, with dots and stars marking the goals attended and violated at each step, and arrows marking tags transferred to or from other presses or the plates section.]


Figure 48. The relevant goals for each step in the scheduling process for Akira 2. [Figure: as for Figure 47, for the 27 steps on Akira 2.]


Figure 49. The relevant goals for each step in the scheduling process for Akira 3. [Figure: as for Figure 47, for the 14 steps on Akira 3.]


Figure 50. The relevant goals for each step in the scheduling process for Akira 4. [Figure: as for Figure 47, for the 13 steps on Akira 4.]

For the 66 scheduling steps observed in the scheduling of 62 jobs during the field study, there were 12, 27, 14 and 13 steps on Akira 1, 2, 3 and 4, respectively. There were twice as many steps on Akira 2 as on the other presses. This was partly due to the need to repair the schedule when a partial breakdown encumbered the press. Jobs that could be processed under the restricted conditions had to replace the original jobs (this case is discussed in detail on page 199 in Section 5.5.1). Moreover, the expected utilisation of Akira 2 was nearly twice that of the other presses, as it was the only press that Neil had planned to operate during the weekend (see Figure 51 to Figure 56). Neil scheduled its operation to meet a goal not shown in the goal structure. To meet the due-date requirements of a large job, it was necessary for it to be collated during the weekend. However, to meet safety requirements, the operator of the collator had to have someone accompany him. Akira 4 was the least utilised. Because its capability was the lowest, Neil tended not to use it when there was a shortage of operators.

Table 14. The percentage of steps for which a goal applies: per press and for all presses.

Goal     Akira 1   Akira 2   Akira 3   Akira 4   All presses
1A          -         15        -         -           6
2A          8          7        -         -           5
2AS1       75         63       79        54          67
2AS2        -         11        -         -           5
2AS3       25         48       43        46          42
2AS4       25         44       50         8          35
8A          8          4        -         -           3
10A        42          7        -        23          15
11A         8         15       14        15          14
12A        25         26       36         -          23
13A        17          7        7        31          14
14A         -          7        -         -           3
15A        17         11        -         -           8
1B          -          4        -         -           2
7B          8         15       29         8          15
2C          -          4        -         -           2
3C          -          7        -         -           3

Table 14 shows, for each goal, the percentage of steps to which it applies on each press. The total of the percentages for each press is greater than 100, since there was often more than a single goal per step.93 From the distribution of goals shown, as expected,

93

For each goal, the percentage of steps across all presses for which it applies cannot be found by summing the percentages across the presses, as the number of steps differed between presses.


goals associated with setting up the presses (i.e., 2AS1, 2AS3 and 2AS4) dominated.94 Over the limited period of the study, Neil referred to most goals at the lowest level of the goal hierarchy. Omitted were goals 3A, “low collator idle time,” 4A, “low collator set-up time,” 5A, “low cutter idle time,” 6A, “low operator idle time,” 7A, “low WIP,” and 9A, “little change to press set-up between discounted jobs.” Goal 6A was not relevant, as there was such a shortage of operators that some machines had to be left idle.95 During the period, there were no discounted jobs, so goal 9A was not relevant. Goals 3A, 4A, 5A and 7A pertain to scheduling the presses to minimise criteria associated with subsequent operations. However, for the investigation to be manageable, the observers restricted their purview to the web presses and consequently ignored these goals.

Table 15. The percentage of steps for which a goal is violated: per press and for all presses.

Goal     Akira 1   Akira 2   Akira 3   Akira 4   All presses
2A          8         -         -         -           2
2AS1        8        19        29        23          20
2AS3       25        15        29         8          18
2AS4        -        11        14        38          15
15A         -         4         -         -           2

Goal 12A, “low tardiness,” was the next dominant goal after the goals associated with set-up. Goal 10A, “complete all jobs for a customer concurrently,” varied across machines and its dominance depended upon the presence of jobs that fitted the

94

Reel changes may have been more dominant than shown, as the observers relied on Neil to tell them when a different reel of paper is required. In contrast, a colour change in the list of jobs alerted the observers to a change in cylinder size. 95

This was particularly evident on Friday when most machines stopped operating early afternoon as there were insufficient operators to supervise them all (see Figure 52).


category. Neil seems to have emphasised goal 13A, “particular jobs meet their due date,” in scheduling jobs on the one-colour press, Akira 4 (see Table 16).96 On the other presses about a third to a half of the jobs were scheduled to meet due date. Most of the jobs that had no due date were premium jobs that had to be completed within a day. The table shows that the two- and four-colour presses, Akira 1 and 2, were used to process the premium jobs.

Table 16. The status of the jobs relative to their due date for the day that the schedule was constructed (percentages are for the day).

Day        Status         Akira 1    Akira 2    Akira 3    Akira 4
Thursday   Late           5 (63%)    2 (25%)    1 (10%)       -
           No due date    2 (25%)    4 (50%)    0             -
           Due day        0          0          7 (70%)       -
           Early          1 (13%)    2 (25%)    2 (20%)       -
Friday     Late           4 (57%)    3 (30%)    4 (36%)    0
           No due date    1 (14%)    3 (30%)    1 (9%)     0
           Due day        0          1 (10%)    1 (9%)     1 (33%)
           Early          2 (29%)    3 (30%)    5 (45%)    2 (66%)
Monday     Late           2 (25%)    6 (60%)    4 (67%)    3 (100%)
           No due date    4 (50%)    1 (10%)    0          0
           Due day        0          2 (20%)    2 (33%)    0
           Early          2 (25%)    2 (20%)    0          0

For premium jobs, Neil met the goal 7B, “no waiting time for premium jobs,” by ensuring that goal 11A, “configure presses for premium jobs,” was satisfied when allocating other jobs. As the due date field is generally ignored when a tag is written in the day, tags without a due date were assumed to be premium. The premium jobs were allocated to the two- and four-colour presses, Akira 1 and 2 (see Table 16), as they had the greater proportion of jobs without a due date.

96

For each day all jobs in that day’s schedule are listed. Therefore, jobs are counted more than once if they are in the schedule for more than a day. However, their status may change across days: a job that is on time one day is late the next.


The violations of goals were mainly associated with set-up (goal 2A and its subgoals): a necessary condition for changeover. The other goals violated were 15A and 13A, “particular jobs meet their due date.” In considering the goals associated with each step, violations of goal 13A were not included. Because a significant proportion of jobs was already late by the time they were scheduled, this goal was often violated even if the scheduler did not desire it. Contributing to the excessive number of late jobs was a very large job that ran for more than a week and finished at the end of the first day of the study. Effectively, it had taken the most versatile press out of service for processing any other job.

5.5.1 Comparing the Schedule with the Production Record

The Gantt charts, Figure 51 to Figure 56, show the production history.97 Machines are listed on the left in abbreviated form. A1 to A4 refer to Akira 1 to 4, respectively. The other machines and resources are the Hunkeler, Sanden and Minami collators (H, S and M, respectively), the Bowe cutter (B), other non-specific resources (O), general work (G) and the Trident press (T1). The suffix on the job numbers shows the cylinder size using the code in Table 17.

Table 17. Code for Akira-press cylinder size

Cylinder Size            Code
11²/3 inch (297 mm)       A
11 inch (279 mm)          B
17 inch (432 mm)          C
14²/3 inch (372 mm)       D
13 inch (330 mm)          AA

97

Using the terminology described in Chapter 3, they show the time slots for instances of the class OperationMachine that form from the entity-relationship IS_PROCESSED_BY between the entities, job and machine.

Figure 51. Actual production 28/5/92 (Thursday). [Gantt chart of the web presses (A1–A4), collators (H, S, M), cutter (B), other resources (O) and general work (G), with bars labelled by job number and annotations for idle time, lunch breaks, maintenance, missing operators and pre-emptions.]

Figure 52. Actual production 29/5/92 (Friday). [Gantt chart, as for Figure 51.]

Figure 53. Actual production 30/5/92 (Saturday). [Gantt chart, as for Figure 51.]

Figure 54. Actual production 31/5/92 (Sunday). [Gantt chart, as for Figure 51.]

Figure 55. Actual production 1/6/92 (Monday). [Gantt chart, as for Figure 51.]

Figure 56. Actual production 2/6/92 (Tuesday). [Gantt chart, as for Figure 51; also shows the Trident press (T1).]

Except for three instances, the operators followed the production schedule. An operator skipped a job that was very difficult to set up. An operator improved performance by interchanging two jobs for the same customer so that there would be a width reduction between jobs. He thereby increased the time between changes to the cross perforator (goal 2AS3). Two jobs for which a resource (blankets) was not available were also skipped. This indicates that the operators followed the schedule

and Neil was aware of contingencies, such as breakdowns, that arose on the shopfloor and amended the schedule to suit.98 The blankets were not available for the two jobs as they had been used to replace damaged blankets on Akira 3. When Neil became aware that the two jobs had been skipped, he amended the schedule. Despite wanting to expedite these jobs to meet goal 7B, as they were premium jobs, he had to put them aside. Instead, he focussed on keeping Akira 2 active (goal 1A) by scheduling jobs that did not require the unavailable blankets. Therefore, to keep the press operating, Neil repaired the schedule in steps 2 to 9 (Figure 48). However, finding jobs that the press could process under the restrictive conditions was not a completely successful strategy; consequently, the press became idle by Friday evening (Figure 52). The next day, Neil transferred the premium jobs’ tags to the plates section of the machine loading board as the conditions for processing the premium jobs did not seem imminent. However, within an hour, conditions had changed and on step 12, he again placed the tags in the queue. The interplay between goals here is instructive. Neil ranked the expedite goal 7B for the two jobs and the idleness goal 1A for Akira 2 as less important than the goal 12A, “low tardiness,” for the large job on Akira 3. That is, he violated two goals normally considered very important to minimise the tardiness of the very large, high-valued job on Akira 3 that was already three days late. The effects of meeting the goal for this job extended across all web presses. To keep one specific machine operating, the capability of every other machine was reduced. That is, he increased the constraints on all these machines. As blankets tore, he replaced them with blankets from the other machines. Two blankets were taken from Akira 1, without affecting its production, as it was not processing B-size jobs. In taking a blanket from Akira 4, Neil had to replan production. He brought C-size jobs to the front of the queue. In the repaired schedule, Neil expected the B-size jobs at the end of the queue would run after spare blankets became available. One blanket was taken from Akira 2, thereby

98

Friday evening was an exception. Jobs planned for Akira 2 were transferred to Akira 1 due to operational difficulties with the Akira 2. This change was not shown in the schedule, as Neil was not on duty.


leaving it only capable of producing three colours instead of four. When more blankets became damaged, two more blankets were removed from Akira 2 and its capability reduced to a single colour. The production record shows that jobs were pre-empted, overlapped and split. Pre-emption occurred frequently. Most were pre-empt-resume, as the change occurred at the completion of a part or when worn plates needed replacement. The press operation and a secondary operation overlapped for jobs 16820 and 16737, on Friday (Figure 52) and Monday (Figure 55), respectively. On Tuesday, job 16727 was split between Akira 2 and the Trident press, and job 16819 was split between the Sanden and Minami collators (Figure 56). These practices do not fit the simple OR models but are characteristic of perplexity, as discussed in Chapter 3.

5.6 Summary of Findings from the Field Study

Neil’s scheduling behaviour was observed to broadly concur with the findings of the initial investigation. At each step he considered a range of goals and endeavoured to satisfice a selected few. The goals upon which he fixated depended upon context. Consequently, they varied over time. In deciding which goals were relevant, and his strategies for their attainment, he considered multiple attributes of jobs and machines. He did not follow the Operations Research practice of defining jobs and machines by a few attributes. Combinatorial complexity was not the problem that confronted Neil. Instead, his scheduling practice was characterised by perplexity. His goals were myriad and some were in conflict. Within the field of goals and state variables — job and resource attributes — he had to decide which goals to satisfice. The case of the large job, explicated above, is a good example. Neil observed the state of resources (i.e., the blankets) and the due-date attribute of the job, and the job’s processing time, which is an attribute of the job-machine entity relationship. He considered the requirements of other jobs and decided that the goal to minimise the tardiness of this job had primacy over all other goals.


[Figure 57: flow chart relating how heavily or lightly constraints define manufacture to the scheduling problems that follow (no feasible schedule, too many feasible schedules, which constraints to loosen, conflicting goals, still more than one feasible schedule) and the responses (relax constraints, simplify the model, select using goals, select using performance evaluation).]

Figure 57. Constraints define the scheduling process.

Regularly, to meet his goals, he had to relax constraints. The question of which constraints to relax was central to his decision-making. His strategy was that of light relaxation, shown by the right-hand path in Figure 57. A constraint that he frequently relaxed was the due date. The extent depended upon subjective judgment based upon context, which is evident in the above example: Neil forwent meeting due dates to attain maximum utilisation. This practice supports the contention advanced in Chapter 3 that schedulers do not take due dates as inviolate.

Table 18. Available time for the web presses in minutes.

Date        Akira 1   Akira 2   Akira 3   Akira 4   Availability
Thursday      1110       690      1020      1020         3840
Friday        1180       800       560       490         3030
Saturday       180       510         0         0          690
Sunday           0       600         0         0          600
Monday         990       990       850       500         3330
Tuesday        860       990       790       410         3050
Total                                                    14540

How well did his schedule perform against the standard metrics of utilisation and tardiness? The production record for the presses shows the time booked to each job. This was assumed to be approximately equal to the sum of the time to set up the press and the time to process the job, which will here be called the manufacturing time. The manufacturing times for the jobs processed during the intensive study are shown in Figure 58 and their frequency distribution is shown in Figure 59.99 Table 18 shows the available capacity for each machine. It was found by subtracting time lost to breakdowns and lack of requisite resources, such as operators or blankets (see Figure 51 to Figure 56). The total available capacity for each day is shown in Table 18. The total manufacturing time for the 59 jobs was 12530 minutes and the total capacity for the period was 14540 minutes. Therefore, the ratio of the total manufacturing time to the total available time was 86%: that is, the presses were idle 14% of the time.

Figure 58. Manufacturing times of jobs processed during the field study. [Chart: manufacturing time in minutes (0–1200) against “normalised” job number.]

Figure 59. Distribution of manufacturing times. [Histogram: frequency against manufacturing time in minutes.]

Of the 59 jobs scheduled during the study, 31 jobs (53%) were tardy based on the date of delivery. However, the final operation for 10 of these jobs had finished by the due date. Of the 25 jobs (42%) that had not completed their final operation by their

99

To meet the requirements of the simulation program, the original job numbers have been replaced by numbers that have been “normalised” to the order of arrival.


due date, 19 (32%) could not be completed by their due date even if they were processed immediately on arrival.100 Therefore, twelve jobs (20%) were tardy due to scheduling activity. For all but two of these, the final operation had finished by the due date, but the job had not been delivered. The average tardiness for the 31 jobs was 5.6 workdays (i.e., excluding weekends). However, for the jobs that were not tardy on arrival, the average tardiness was 1.8 workdays. If the completion of the final operation is used as the criterion for meeting the due date, then the average tardiness for the 25 tardy jobs is 5.3 workdays. Applying this measure to the jobs that were not tardy on arrival, the average tardiness did not vary from the 1.8 workdays based on delivery. If secondary operations are ignored and completion of the printing operation is used as the criterion for meeting the due date, then the number of tardy jobs remains at 25 and the average tardiness reduces to 5.1 workdays. Hence, reducing the scheduling problem to a single operation has a minor effect on average tardiness. In addition, using completion of the single operation as the criterion for meeting the due date changes the number of tardy jobs by 9%. Therefore, the reduction of the scheduling problem to a single operation does not significantly affect the measures of performance.
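The two statistics used above, the number of tardy jobs and the average tardiness of the tardy jobs, can be computed mechanically from completion (or delivery) times and due dates. The sketch below is illustrative only; the sample data are placeholders, not values from the study.

# A minimal sketch of the two tardiness measures used above. Times are in
# workdays; the sample jobs are illustrative, not data from the study.

def tardiness(finish, due):
    """Tardiness of one job: zero if it finishes on or before its due date."""
    return max(0.0, finish - due)


def tardiness_summary(jobs):
    """jobs: iterable of (finish, due) pairs.
    Returns (number of tardy jobs, average tardiness of the tardy jobs)."""
    tardy = [tardiness(f, d) for f, d in jobs if f > d]
    if not tardy:
        return 0, 0.0
    return len(tardy), sum(tardy) / len(tardy)


# The same jobs can be measured against delivery dates or against completion
# of the final (or only the printing) operation, as in the text above.
jobs_by_delivery = [(5.0, 3.0), (2.0, 4.0), (7.5, 6.0)]
print(tardiness_summary(jobs_by_delivery))   # (2, 1.75)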

5.7 Benchmarking Performance under Perplexity

To obtain a benchmark for judging Neil’s scheduling performance, the problem was reduced to that of a single operation that may take place on any of the four presses. Constraints on the number of colours each press could print in a single pass and the special capabilities of particular presses were removed. The shop was reduced to four parallel machines equal in capacity and capability and always available. To conform to the requirement on availability, the times when presses were unavailable in the shop were subtracted from the actual production day to give a simulated production

100

On their arrival, sixteen jobs were tardy and three jobs although not tardy had negative slack time on arrival.


day in which all presses were available (Figure 60, Table 19).101 All constraints relating to the job attributes, except the arrival and due dates and the number of impressions, were removed.

Table 19. The length of each day for the simulation is given by the total available time across the presses divided by the number of presses.

Day    Length of the actual          Length of the simulated
       production day (minutes)      production day (minutes)
0             1440.00                         960
1             1440.00                         760
2              720.00                         170
3              780.00                         150
4              990.00                         830
5              990.00                         760
6              990.00                         990
7              990.00                         990
8              990.00                         990

101

The modified length of day was observed for days 0 to 5. The lengths of the day for days beyond day 6, were used to calculate a simulated time for due dates that were beyond the period of the study. For this purpose, the length of the standard production day was adequate.


Parsifal, a scheduling simulation program, was used to construct various schedules for the simplified problem.102 In the simulation, when a machine becomes available, a dispatch heuristic decides which job to schedule next. The heuristic assigns a priority to each job and then selects the job with the highest priority. Parsifal constructs a schedule for each heuristic that the user has selected. It also constructs a schedule using general pairwise interchange to act as a benchmark for assessing the performance of the other heuristics.103 For the data from the intensive study, Parsifal used the following heuristics, described in Chapter 3, to construct schedules:

1. FCFS: First Come First Served, which is the same as First In First Out (FIFO);
2. SPT: Shortest Processing Time;
3. LPT: Longest Processing Time;
4. EDD: Earliest Due Date;
5. SLACK: Minimum slack remaining;
6. R&M: A due-date rule.
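The dispatching mechanism just described is straightforward to sketch. The following is a minimal illustration of dispatch-rule scheduling on parallel identical machines with several of the listed priority rules; it is not the Parsifal implementation, the R&M rule and the pairwise-interchange benchmark are omitted, and the job tuples and the small instance at the end are illustrative only.

# A minimal sketch of list dispatching on m parallel identical machines.
# Each job is a tuple (arrival, processing_time, due_time), all in minutes.

RULES = {
    "FCFS":  lambda job, now: job[0],                 # earliest arrival first
    "SPT":   lambda job, now: job[1],                 # shortest processing time first
    "LPT":   lambda job, now: -job[1],                # longest processing time first
    "EDD":   lambda job, now: job[2],                 # earliest due date first
    "SLACK": lambda job, now: job[2] - now - job[1],  # minimum slack remaining first
}


def dispatch(jobs, m, rule):
    """Start the highest-priority available job whenever a machine becomes free."""
    key = RULES[rule]
    pending = sorted(jobs, key=lambda j: j[0])
    machines = [0.0] * m                 # time at which each machine next becomes free
    completions = []
    while pending:
        now = max(min(machines), min(j[0] for j in pending))
        available = [j for j in pending if j[0] <= now]
        job = min(available, key=lambda j: key(j, now))
        pending.remove(job)
        i = machines.index(min(machines))
        machines[i] = now + job[1]
        completions.append((job, machines[i]))
    return completions


# Illustrative instance only: (arrival, processing, due).
jobs = [(0, 120, 300), (0, 45, 200), (30, 400, 900), (60, 90, 250)]
for rule in RULES:
    finished = dispatch(jobs, m=2, rule=rule)
    tardies = [max(0, f - j[2]) for j, f in finished]
    print(rule, "tardy jobs:", sum(t > 0 for t in tardies), "total tardiness:", sum(tardies))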

102

The Parsifal software accompanies Morton and Pentico’s (1993) text on heuristic scheduling systems. 103

It uses the Rachamadugu and Morton (R&M) heuristic as a starter.


The length of the simulated production day for day r is given by:

    $T_{mod,r} = \dfrac{m T_r - TL_r}{m}$

where
    $m$ = the number of machines
    $T_r$ = length of the production day for day r
    $TL_r$ = time lost for day r due to operators being unavailable or machines being broken down

The due time of job j in simulated time is given by:

    $d_j = \sum_{r=0}^{day_j - 1} T_{mod,r} + \dfrac{td_j}{T_{day_j}} \, T_{mod,day_j}$

The actual completion time of job j converted to simulated time:

    $C_j = \sum_{r=0}^{day_j - 1} T_{mod,r} + \dfrac{tc_j}{T_{day_j}} \, T_{mod,day_j}$

The actual arrival time of job j converted to simulated time:

    $r_j = \sum_{r=0}^{day_j - 1} T_{mod,r} + \dfrac{tr_j}{T_{day_j}} \, T_{mod,day_j}$

where
    $day_j$ = the date of the relevant event for job j (its due, completion or arrival date) minus the start date of the study
    $T_{day_j}$ = length of the production day on that day
    $T_{mod,r}$ = length of the production day, modified, for day r
    $td_j$ = time of day (in minutes) at which job j is due
    $tc_j$ = time of day (in minutes) at which job j is completed
    $tr_j$ = time of day (in minutes) at which job j arrived

Figure 60. Formulae for determining the length of the production day and the due date in simulated time and for converting a job’s actual completion and arrival times to simulated time.

All times (dates and processing times) were converted to minutes, as Parsifal requires integer data. Furthermore, Parsifal cannot handle times that are earlier than the start of a simulation. Therefore, the arrival time of jobs already available at the start of the study was set to zero. In addition, the due time for the 16 jobs that were already tardy at the start was set to zero. Since all jobs in the simulation must have a due time, all jobs without a due date were set to 7th June, that is, four days beyond the end of the study. As the simulated production day was less than the actual day, arrival times had to be proportionally adjusted to the simulated day. The length of the simulated day was determined using the formula in Figure 60. The available capacity for each shift was used in the conversion of dates to minutes. Figure 61 shows the arrival and due times, with time measured from the start of the intensive study. In converting due dates to due times in minutes, due times were assumed to be ex factory. For jobs to be ready for delivery by the close of business, 4 p.m. was presumed to be the latest time of completion. To find the due time of a job, the time from the start of the study to the due day was first determined (Table 19). Then the time from the day’s start to 4 p.m., proportionally adjusted to the simulated day, was added. For example, the due time of jobs due on the Monday, the 4th day, was 960 + 760 + 170 + 150 + 600 × (830/990) = 2543 minutes.
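A minimal sketch of this conversion, using the actual and simulated day lengths from Table 19, is given below. The function is illustrative, not the thesis’s program, and the production day is taken to start at 6 a.m. (so 4 p.m. falls 600 minutes into the day), which reproduces the 2543-minute worked example.

# A minimal sketch converting an event occurring t_of_day minutes into actual
# day `day` to simulated time, following the formulae in Figure 60.
# Day lengths are taken from Table 19.

ACTUAL = [1440, 1440, 720, 780, 990, 990, 990, 990, 990]      # T_r, minutes
SIMULATED = [960, 760, 170, 150, 830, 760, 990, 990, 990]     # T_mod_r, minutes


def to_simulated(day: int, t_of_day: float) -> float:
    """Simulated time of an event occurring t_of_day minutes into actual day `day`."""
    return sum(SIMULATED[:day]) + (t_of_day / ACTUAL[day]) * SIMULATED[day]


# Jobs due at 4 p.m. on the Monday (day 4): 960 + 760 + 170 + 150 + 600*(830/990)
print(round(to_simulated(4, 600)))   # 2543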

Figure 61. The arrival and due times of jobs processed during the intensive study. [Scatter plot: time in minutes (0–9000) against “normalised” job number, with separate series for arrival and due times.]

The simple model did not include set-ups. Therefore, the processing time in the simulation was set to the actual manufacturing time, which includes both set up and processing. While this simplification ignores the effect of each job’s predecessor on its manufacturing time, it provides a reasonable approximation for judging overall performance. Comparison of the performance of the actual schedule at Melamed with the simulation requires the adjustment of some data. The due dates and arrival times that

were earlier than the start of the study were set to 28th May. For the single printing operation with completion of the operation used as the criterion for meeting the due date, the number of tardy jobs reduces from 25 to 20 and the average tardiness reduces from 5.1 to 1.6 workdays. The reduction in the number of tardy jobs occurs because some jobs that had their due date adjusted to the first day of the study finished the same day. Results that are more refined than the above figures based on dates, which came from the production database, are obtainable from the completion and arrival times observed during the study and shown in Figure 51 to Figure 56. All times were proportionally adjusted to the simulated day (Figure 60). If jobs are considered to be due at the end of the production day, the number of tardy jobs was 25 (42%). This corresponds to the record of production. As jobs that were due before the start of the study were set to zero in the simulation, they must remain tardy even if they were finished during the first day. If completion by 4 p.m. was the criterion for meeting the due date, then the number of tardy jobs increases to 27 (46%). Two jobs completed on the due day finished after 4 p.m. For both ways of calculating tardiness, the average tardiness was approximately 1400 simulated minutes, which is equivalent to 1.7 simulated days.104 In the simulation, applying the 4 p.m. criterion for due time produced 20 tardy jobs (34%) for the better performing heuristics. That is, there were seven fewer tardy jobs in the simulation than in the actual production (Table 20). The difference in performance is starker if the 19 jobs for which tardiness was unavoidable are discounted. Unavoidability was due to these jobs having negative slack time on arrival (Figure 62). The average tardiness for the schedule constructed by Neil was between 2 and 6.5 times longer than that found for the simulation, depending upon the heuristic (Table 21). If the criterion for meeting the due date is shifted to the end of the day, the slack time on arrival for one job changes from negative to positive. Therefore, under this condition, 18 jobs were unavoidably tardy. As this equals the number found using the EDD, SLACK and R&M (Table 22), the schedules constructed by these rules are

104

The average length of the simulated day over the period of the study, ignoring the limited weekend production, was 860 minutes (Table 19).


optimal. By only varying the expected due time slightly, optimality was attained. Thus, the simplified scheduling problem is not difficult from the perspective of combinatorial complexity.

Figure 62. The distribution of slack time on arrival. [Histogram: frequency against slack time in minutes, from about −3000 to 9000.]

Table 20. Number of tardy jobs (modified arrival times and due date at modified 4 p.m.)

Heuristic      Objective Value   % Deviation
FCFS                26.0             30.0
SPT                 20.0              0.0
LPT                 28.0             40.0
EDD                 21.0              5.0
SLACK               23.0             15.0
R&M                 20.0              0.0
BENCHMARK           20.0              0.0

Table 21. Average tardiness (modified arrival times and due date at modified 4 p.m.)

Heuristic      Objective Value   % Deviation
FCFS               395.0             73.0
SPT                398.0             74.0
LPT                669.0            192.0
EDD                268.0             17.0
SLACK              297.0             30.0
R&M                223.0             -2.5
BENCHMARK          229.0              0.0


Table 22. Number of tardy jobs (modified arrival times and due date at end of day)

Heuristic      Objective Value   % Deviation
FCFS                21.0             17.0
SPT                 19.0              5.6
LPT                 26.0             44.0
EDD                 18.0              0.0
SLACK               18.0              0.0
R&M                 18.0              0.0
BENCHMARK           18.0              0.0

Table 23. Average tardiness (modified arrival times and due date at end of day)

Heuristic      Objective Value   % Deviation
FCFS               354.0             64.0
SPT                387.0             79.0
LPT                600.0            178.0
EDD                260.0             21.0
SLACK              286.0             32.0
R&M                218.0              1.0
BENCHMARK          216.0              0.0

For each heuristic, the machine utilisation is found by comparing its makespan, shown in Table 24, with the absolute minimum possible. This occurs when there is no idle time. It is found by dividing the total processing time by the number of machines:

    $\dfrac{\sum_i p_i}{m} = \dfrac{12530}{4} = 3133$ min.
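Stated explicitly (the relation is implied by the text rather than printed in it), the utilisation reported in Table 25 for a heuristic $h$ is this lower bound divided by that heuristic's makespan $C_{\max,h}$ from Table 24:

    $U_h = \dfrac{\sum_i p_i / m}{C_{\max,h}}, \qquad \text{for example } U_{\mathrm{FCFS}} = \dfrac{3133}{3290} \approx 95\%.$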

All the heuristics produced high utilisation between 89% and 99% (Table 25).105 While utilisation in the simulation was higher than the 86% ratio of manufacturing

105

Utilisation is between 81% and 96% for the simple heuristics and 99.6% for the benchmark.


time to available time for the actual production, the value for SPT, which is a commonly applied heuristic, was similar to that observed at Melamed.106 Significantly, high utilisation for both the actual and simulated systems shows that it was not difficult to keep the machines operating in this scheduling environment.

Table 24. Makespan

Heuristic      Objective Value   % Deviation
FCFS               3.29E3              4.5
SPT                3.53E3             12.0
LPT                3.16E3              0.4
EDD                3.23E3              2.6
SLACK              3.16E3              0.4
R&M                3.49E3             11.0
BENCHMARK          3.15E3              0.0

Table 25. Utilisation

Heuristic      Utilisation
FCFS               95%
SPT                89%
LPT                99%
EDD                97%
SLACK              99%
R&M                90%
BENCHMARK          99%

The simplicity of the model gives it an unfair advantage: nevertheless, it provides an extreme bound on performance that cannot be surpassed in the real domain.

106

The machine utilisation, in the model, and the ratio of the total manufacturing time to the available time, for the actual production, both measure time lost waiting for jobs.


However, this model cannot be used to construct schedules for the shop. To do so, the model would have to include sequence-dependent set-ups and the different capabilities of the presses. As discussed in Section 5.3, this situation is far more difficult than the equal-machine problem without set-ups. Neil’s ability to anticipate arrivals allows him to predetermine scheduling sequences before jobs actually arrive. In contrast, dispatching heuristics only consider jobs that are currently available for processing. Neil’s poor performance in relation to tardiness probably could be improved, especially since he never applies any OR heuristic that is known to reduce tardiness: however, the extent is difficult to judge. While Neil’s estimates of manufacturing time provide a guide for allocating time slots on machines, often there is a mismatch between estimates and actual values. This occurs because of the high standard error across the range and the skewness at the lower end. Where estimates are significantly lower than the actual values, jobs take longer to complete than expected. Where machines are highly utilised, there is little spare capacity for rescheduling jobs. Therefore, there is little opportunity to make partial repairs that leave most of the original schedule intact. Instead, failure to meet the anticipated completion time extensively affects the schedule. Consequently, Neil may find it difficult to repair the schedule so that promised delivery dates are met. Poor estimates of manufacturing time also affect determination of slack time remaining. For critical jobs, Neil calculates the latest time that printing can start for their due date to be met. Besides the estimates of the manufacturing time for printing, this requires estimates for the secondary operations. In scheduling critical jobs, Neil includes a buffer time to provide sufficient leeway to cater for the inaccuracies in the estimates.
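The latest-start calculation described for critical jobs can be written down directly. The sketch below is one illustrative reading of that practice; the buffer and the operation estimates in the example are assumptions, not values from the study.

# A minimal sketch of the latest-start calculation described above: the latest
# time printing can begin if a critical job is to meet its due date, with a
# buffer to absorb errors in the manufacturing-time estimates.

def latest_print_start(due_time: float,
                       print_estimate: float,
                       secondary_estimates: list[float],
                       buffer: float) -> float:
    """All times in the same units (e.g. simulated minutes)."""
    return due_time - (print_estimate + sum(secondary_estimates) + buffer)


# Illustrative example: a job due at 2543, estimated at 240 min of printing,
# 180 min of collating and 60 min of cutting, with a 120-minute buffer.
print(latest_print_start(2543, 240, [180, 60], 120))   # 1943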

5.8 Drawing the Threads of the Discussion Together

Production scheduling is the management of manufacturing constraints in a way that maximises desired outcomes. Constraints set the limits on the activities transforming the state of the schedule. In six case studies, McKay, Buzacott and Safayeni (1989) found several hundred types of constraints that practising schedulers considered. There are both external constraints imposed by the customer and internal constraints.


The customer specifies the physical requirements of the job (e.g., artwork, the number of forms, sheet dimensions and quality) and the due date. Internal constraints come from the capabilities and limitations of the machines (i.e., the technical constraints) and availability of resources (i.e., machines, humans and raw materials). Figure 40 shows the Ends-Means relations between the physical functions and physical resources. In designing a collator, the designers constrain physical activities to fulfil specific physical purposes (e.g., rollers and guides constrain the movement of paper). The degree of constancy varies between constraints. Some technical constraints, especially the primary technical capabilities of machines are immutable (e.g., a collator does not print), while others may change (e.g., the size of the cylinders). Under some circumstances, even constraints normally taken as immutable may be changed: perhaps by, moving parts from one machine to another, adjusting settings in a non-standard way or by modifying parts. Some constraints change frequently: availability of raw materials and labour may change over the day. Some constraints arise from previous scheduling decisions: jobs already scheduled place time reservations on machine availability. The printing shop studied is inordinately complex from an Operations Research perspective. There are five presses in parallel, four of which are essentially the same except for the number of colours each can print and some minor differences in capabilities (e.g., only one press incorporates a UV fixer). To manage the allocation of jobs to these presses the scheduler is bound by numerous technical constraints (shown as IF-THEN rules in Table 5). The behaviour of Neil, the scheduler at the time of the investigation, was studied to see how he addressed the scheduling task. A discursive approach was used to analyse the data collected through interview and observation. Although it does not have a definitive methodological structure that allows findings to be unequivocal, it affords insight into the complexity and the perplexity of scheduling an actual job shop. The analysis showed that Neil focussed on goals that operational steps under his control could directly affect. As he manipulated the schedule to move it closer to these goals, he produced a schedule that also tended to be closer to other, more abstract, goals. For example, Neil might manipulate the schedule to try to obtain low idle and set-up times for the presses and collators. This activity also would move the
schedule closer to the higher-level goal, full utilisation. Similarly, full utilisation is a subgoal of a more abstract goal, productivity maximisation. Unlike OR models, Neil’s goals were manifold and hence the operational policies that he might have applied were numerous. Neil did not simultaneously pursue all goals. Which set of goals he pursued varied over time. For example, goal 8A, “Invoice all jobs in the same month paper is purchased,” is not at the focus of attention early in a month, but becomes dominant near its end. This concurs with the findings of McKay et al. (1989): “Usually not all constraints and goals are active at the same time. As time goes on — hour by hour, day by day — different goals must be satisfied and addressed while dealing with changing constraints.” That is, a schedule judged to be “good” when generated might be a “bad” schedule if generated later the same day. In using a specific policy, Neil transformed the state of the schedule to a new state that was closer to the related goal. If he were to seek to meet one goal at a time, his operational policies might be similar to those shown in Table 11. Often, however, he sought to satisfy many goals concurrently. Under these circumstances, treating scheduling activity as a string of operational policies that apply to each goal taken individually may inadequately represent the scheduling process. A series of transformations, each of which improves the schedule in relation to a set of goals, may be a more apt representation of Neil’s behaviour. Each transformation consists of a series of steps that moves the state of the system closer to the set of goals. This is particularly evident where goals are disparate. For instance, to satisfy combined goals of utilisation and tardiness, the procedure Neil followed was distinctly different to the procedure for either goal. There is a hierarchical structure to the goals (Figure 44). At the lowest level of the hierarchy are goals A, the operational objectives, that directly guide the building of the schedule. With some of these, it is clear when they have been satisfied. Examples are “Complete all jobs for a customer concurrently” and “Particular jobs meet their due date.” For goals that consist of fuzzy terms, such as “low” or “little,” this is less clear and depends on the context. Above this are goals B, to which the scheduler may directly attend; however, the extent of the focus varies between goals. At the higher levels are goals that indirectly relate to shopfloor parameters. The focus is on short-term financial viability and customer patronage. At the top of the goal structure is the raison d’être of the company, the maximisation of long-term financial return. It
depends upon short-term financial viability and customer patronage, which are the goals that are immediately below it. The higher a goal is up the hierarchy, the less it directly relates to immediate shopfloor activity. High level goals tend to be attained through satisfaction of low level goals, rather than by the direct attention of the scheduler. Nonetheless, Neil sometimes directly considered them when making scheduling decisions. In Figure 44, directed arcs into a goal indicate that as its underlying subgoals move towards satisfaction, it will also tend to move towards satisfaction. They do not depict direct causation. A goal may not necessarily be violated (i.e., the state of the schedule has moved further away from the goal) if some of its underlying subgoals are violated. For example, changing the cylinder size violates goal 2A, “Low press set-up time,” and seemingly reduces machine utilisation, goal 1B. However, if the cylinder is not changed, no available jobs may exist for the current configuration of the press. The press would then become idle. Consequently, goal 1A, “Low press idle time,” would be violated. So the scheduler may deliberately violate goal 2A in the endeavour to maintain high machine utilisation. Arcs not only link goals on different levels of the hierarchy, but also within a level. The directed arc into a goal shows that it is constrained by the goal to which it is linked. Therefore, in pursuing goals that constrain other goals, the scheduler may need to consider any adverse effects that may arise in other parts of the goal structure. Pursuing maximisation of productivity, goal 1C, the scheduler may seek to maximise processing speed, goal 2B. As goal 2B is linked to goal 9B, pursuing maximum speed may adversely affect the quality of the work that the customer deems to be acceptable. Therefore, the scheduler has to consider what is an acceptable lower bound on performance for goal 3C, to maximise satisfied customers, when pursuing maximisation of productivity. The presence of multiple goals is common to systems that have high dimensionality. In discussing the psychology of human behaviour in complex systems, Woods (1988) states that the multiple relevant goals can compete or constrain each other. Furthermore, scheduling the job-shop shares characteristics with other complex systems. It has high uncertainty in the data and the future states and events are not completely predictable. Woods argues that dynamic, event-driven worlds, which scheduling indeed fits, require opportunistic and flexible problem solving. The decision-maker has to assess the situation from events as they arise.


In scanning jobs that are ready to be scheduled, the scheduler may see a dominant pattern in the value of attributes across jobs, for example, jobs requiring the same size cylinder and having the same width. This pattern is a sign that activates a policy that meets goal 2A, “Low press set-up time.” Its recognition concords with the discussion of the recognition process in Chapter 4. From this perspective, the scheduler’s decision-making process consists of a series of recognition-action cycles that exercise:

1. Experiential knowledge (e.g., in balancing multi- and single-part jobs on the presses to maintain the flow of work to the collators while minimising waiting time);
2. Recognition of patterns amongst the jobs (e.g., patterns in job attributes on which to group jobs, such as, depth of paper and customer);
3. Inferential decision-making (e.g., the production rules associated with the technical constraints).

Neil assessed the situation from events as they arose and applied multiple goals opportunistically. The moment-by-moment configuration of the system and the currently feasible alternatives determined which policy to apply. Commonly, there were several patterns in the data that relate to different signs. Which sign Neil attended depended on the opportunities they presented. That is, he had to decide which goals were dominant. Neil’s behaviour concurs with other experimental findings. Haider, Moodie and Buck (1981) found that experimental subjects, using an interactive simulator, opportunistically switched between scheduling policies. An opportunistic approach to scheduling is not restricted to human schedulers. AI researchers also have used opportunism to focus attention on the most critical activities of a schedule (see Chapter 3). Besides normally understood events such as the arrival of plates for a job, failure to meet a goal can also trigger scheduling activity. For example, if the metric for “Low press set-up time” were too high, then the scheduler may try to improve the schedule. He/she would scan the jobs to find strings of jobs with the same values for the set-up attributes (e.g., the same colours) but not currently grouped together. Sometimes Neil anticipated the arrival of a job and allocated it to a press before its arrival. If its plates had not arrived by the time the tag had progressed to the front of the queue, the press might become idle. He could easily tell when this occurred as he

He could easily tell when this occurred, as he marked the tag on receipt of the plates. A tag so marked that was at the front of the queue could act as a trigger for scheduling activity directed towards the goal associated with idle time. In addition, a queue holding very few jobs might also trigger activity towards this goal. As these situations could be seen clearly on the machine loading board, Neil could easily judge the extent to which the schedule was not meeting this goal: whether the queues for one or more machines were in this state. Three examples illustrate distinctive aspects of Neil's attunement to goal performance. During the construction of a schedule, it may become obvious that a particular job was not going to meet its due date. If this goal were imperative for the specific job (goal 13A), then its failure would trigger further scheduling activity to rectify the situation. In some circumstances, the need to satisfy other goals and constraints may result in a schedule that violates this goal for one or more jobs. Whenever this occurs, Neil is clearly aware which jobs are affected. Moreover, since it is usual for only a few jobs to fit this category, it is plausible that he knew exactly how many there were. While it is essential that particular jobs met their due date, Neil aimed to have all jobs delivered by their due date (goal 8B). The number of tardy jobs would be a measure of this goal's performance. Though aware which jobs would be tardy if a particular schedule was followed, it was clear from the study that he usually did not know extempore how many were tardy. Therefore, the trigger is qualitative. On seeing many tardy jobs, he may try to reduce their number. He was unaware of a common OR measure of performance, the average tardiness. For goal 1B, "Fully utilise all machines," he did not measure the utilisation. While he did not measure performance, his practice of never leaving a machine idle triggered a feedback process that ensured performance was high. For other goals, Neil neither mentioned nor obviously applied any measure of performance. Clearly the information on performance that Neil could readily access was quite limited and varied in quality. If Neil could have easily observed the average tardiness during the schedule construction, then an unsatisfactory value might have triggered further scheduling activity so that its performance might have come closer to the Parsifal benchmark. To assess a schedule's performance, a suitable metric is required for each goal; for example, a metric for "Low press idle time" would perhaps be the total idle-time per machine per shift.

The lowest level of the hierarchy measures how well the schedule meets operational objectives. Just as temperature and pressure in the power plant discussed in Chapter 4 are state variables that signal the system's state, these measures also function as signals, as they identify the state of the schedule. The scheduler can identify the scheduling state variables for particular machines and jobs. Performance measures for the goals at the next highest level are derived from the operational states. As signs, they indicate whether the schedule's performance is within normal bounds. This is analogous to the indication in the process industry that the system is at steady state. Just as the supervisory controller would then know that the functional purpose of the system is being met, a scheduler who is experienced in interpreting these measures knows whether the schedule's performance is within acceptable bounds. For two goals at this level, standard OR metrics are pertinent. They are "Fully utilise all machines" and "All jobs delivered by their due date," with suitable metrics being percentage utilisation and average tardiness, respectively. Metrics for the qualitative goals are more problematic. A schedule that performs well for these metrics will support the attainment of the higher-level goals associated with productivity, cash flow and satisfied customers.107 This in turn leads to short-term financial viability and repeat custom and thereupon contributes to the maximisation of the long-term financial return.
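These metrics are straightforward to compute once a schedule is recorded. A minimal sketch, assuming the schedule is available as a list of operations with machine, start, finish and due-date fields (the field names and time units are assumptions for the example):

    # Illustrative calculation of the performance signals discussed above.
    def schedule_metrics(ops, machines, horizon):
        busy = {m: 0.0 for m in machines}
        tardiness = []
        for op in ops:
            busy[op["machine"]] += op["finish"] - op["start"]
            tardiness.append(max(0.0, op["finish"] - op["due"]))
        utilisation = {m: busy[m] / horizon for m in machines}     # signal for "Fully utilise all machines"
        idle_time = {m: horizon - busy[m] for m in machines}       # signal for "Low press idle time"
        avg_tardiness = sum(tardiness) / len(tardiness) if tardiness else 0.0   # "All jobs delivered by due date"
        n_tardy = sum(1 for t in tardiness if t > 0)               # the count Neil could judge only qualitatively
        return utilisation, idle_time, avg_tardiness, n_tardy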

5.9 Conclusion

In this chapter, the scheduling activity of Neil, a scheduler at a printing company, was studied. Neil did not approach the scheduling of production using Operations Research methods. Indeed, even if he were aware of OR approaches, which he was not, there is no method available that applies to the prevailing conditions. Many factors besides those associated with OR scheduling rules had an influence on his decision making. In allocating jobs to machines, he had to operate within the bounds of many constraints. Scheduling activity was triggered by some event, for example, the arrival of new jobs or a telephone call from a customer. On scanning the

107. Other factors than those directly associated with scheduling also affect the satisfaction of these goals.


attributes of both allocated and unallocated jobs, Neil found patterns in the data that triggered his interest in one or more goals. He then applied an operational policy that would tend to satisfy the chosen goal function. Where no policy was available, Neil would first work out the procedural steps and then execute them. To gauge the performance of Neil's schedules for the standard OR metrics of machine utilisation and tardiness, a benchmark was sought by reducing the problem sufficiently for simulation using standard OR heuristics. The scheduling problem was reduced to four parallel machines equal in capacity and capability and always available. Set-up times were also ignored. Very high machine utilisation for both actual and simulated systems indicates that it was not difficult to keep the machines operating in this scheduling environment. Therefore, avoiding machine idleness is not difficult in this environment, although it was for Neil a primary concern, as he had to ensure that he kept jobs flowing to each machine.108 A significant weakness of the simulation was the inability of Parsifal to deal with set-ups, which were a primary focus of Neil's scheduling activity; however, Parsifal was the only simulation software available. Because the time spent in setting up the presses was ignored, the difference between the actual and simulated tardiness was considerable. If scheduling with set-ups were to be simulated, a more realistic lower bound on performance relating to tardiness could then be obtained and hence Neil's scheduling performance relative to standard scheduling heuristics could be measured.109 Neil's goals were normally immediate targets. An example is the situation where he observed the cylinder size required by the last job in a queue. He scanned the

108. Similarly, Dutton and Starbuck (1971) found in their observation of Charlie, the scheduler of fabricators, that from his viewpoint the plant is a consumer of his schedules. To know when he next had to attend to the development of a schedule for a particular machine, he had to be able to predict when it would run out of orders and become idle.

109. The ProtoHIPSS discussed in Chapter 7 is capable of performing such a simulation, but in its current state of development is unable to take the record of dynamic arrivals for past data to use for a benchmark comparison.


available jobs for those that met the target, "no change to the set-up of the machine." When scanning the job tags on the machine loading board, he observed patterns in the values of job attributes. However, he could not immediately access the job-machine attributes that form only when a job is allocated to a machine. For example, the time to process a job is a function of the size of the job and the speed of the machine. Therefore, to determine the processing time, he had to consider the speed of the press to which he intended to allocate the job.110 This limitation would be overcome if Neil used a Gantt chart instead of a machine loading board. However, overcoming it may come at the expense of the job attributes, as displaying their values on a Gantt chart is difficult. By directing his attention to the constraints and the immediate targets, Neil produced workable schedules. A target itself could trigger scheduling activity to improve performance where he could easily observe how well the schedule performed for the target. This was only possible for a few targets, and the quality of the available measures was wanting. The targets that Neil focussed on were operational objectives of higher-level unstated goals. For example, the operational objectives "to keep machines from being idle" and "to have low set-up time" both relate to a higher-level goal, the maximisation of machine utilisation. By analysing Neil's scheduling activity, a goal structure was developed that shows the relationship between the operational objectives Neil sought and higher-level goals. The lowest level in the goal structure shows the actual operational objectives that Neil followed. The goal structure is a mapping of plausible goals at various levels. The known practices were linked to a presumed aim of a rational firm, "the maximisation of long-term financial return." Other mappings may be formed that are equally credible in supporting Neil's practices. For example, at level B, the minimisation of flowtime may be a suitable goal. It could be achieved through the minimisation of machine idle time (goals 1A and 3A), set-up time (goals 2A and 4A) and WIP (goal 7A). It is postulated that scheduling performance would improve if

110. That is, the entity relationship 'job A is processed on machine B' is only created when the job is allocated to a machine.


the scheduler could be attuned to measures of performance that encompassed all goals. While the goal structure is for a particular domain and scheduler, it provides a basis for discussing the architecture of a hybrid intelligent production scheduling system (HIPSS). In Chapter 6, the work domain analysis and activity analysis tools of CWA are applied to the data from the field study to develop an architecture for constructing an HIPSS in Chapter 7.


Chapter 6 Cognitive Work Analysis of Field Data

The development of a hybrid intelligent production scheduling system (HIPSS), which can handle the degree of scheduling perplexity human schedulers commonly accommodate, depends on a sound understanding of how humans approach scheduling activity in a job shop. Scheduling production tasks within a job shop consists of planning the order of work on various machines. In allocating work to machines, schedulers commonly arrange tags representing jobs on a machine loading board or on a Gantt chart. At one level, their work activity is the physical activity of placing and moving tags. While the technical constraints of the machines restrict feasible choices, the schedulers' goals and their perception of the functional purpose of scheduling dominate decision making. Scheduling support should therefore go beyond merely showing the order in which jobs are placed on the board; it should also support the schedulers' decision activities. In this chapter, the work domain analysis and activity analysis tools of Cognitive Work Analysis (CWA) are applied to the data from the field study. The currently available CWA tools are found to be inadequate for representing human decision-making processes in discrete-event systems. To describe scheduling behaviour as an act of navigating through a sea of multifarious goals, new tools are developed to extend the current formalisms. The findings from this chapter form the foundation for developing an architecture for an HIPSS in Chapter 7.

6.1 Domain Characteristics

At Melamed, there were complex ends-means relationships between physical functions and physical devices.

Functions could often be performed on more than a single device, as shown in Figure 63. For simplicity, the diagram shows a single Akira press to represent four presses; this representation, however, conceals another dimension of the complexity. The four Akira presses varied in their colour capacity and ancillary attachments, which gave them extra functionality. Hence, the number of presses on which a job could be processed varied between jobs. Setting up a machine for an operation was sequence dependent. A major delay occurred where the value of a particular attribute for a job differed from that of its predecessor; other attributes triggered minor set-ups. There was a quick-turnaround service for premium jobs. On the arrival of a 'hot' job, Neil, the scheduler, had to decide whether to pre-empt the current job. These, and other, constraints took the scheduling process beyond the capabilities of standard OR heuristics. In this environment new jobs arrived intermittently, machines could break down and management often demanded that specific jobs be expedited. In planning work for a shift, Neil included anticipated arrivals; however, there were other arrivals that were unanticipated, and sometimes the actual arrival times differed from those predicted. In planning beyond the immediate shift, Neil had some indication of expected arrivals, but it was not at all comprehensive.111
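The sequence dependence of the set-ups can be expressed schematically. In the sketch below the choice of cylinder size as the major attribute follows the discussion in Chapter 5, but the minor attributes and the times assigned to major and minor set-ups are assumptions introduced only for illustration.

    # Illustrative sequence-dependent set-up between consecutive jobs on a press.
    MAJOR_ATTR = "cylinder"                    # a difference here forces the major delay
    MINOR_ATTRS = ("width", "colours", "paper")
    MAJOR_TIME, MINOR_TIME = 45, 10            # minutes; assumed values, not field data

    def setup_time(prev_job, next_job):
        t = MAJOR_TIME if prev_job[MAJOR_ATTR] != next_job[MAJOR_ATTR] else 0
        t += MINOR_TIME * sum(prev_job[a] != next_job[a] for a in MINOR_ATTRS)
        return t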

111. Its extent is discussed at length in Chapter 5.


Figure 63. Ends-Means relationships between physical functions and physical resources.

These characteristics are not atypical for a job shop. Vaguely defined constraints and uncertainty of temporal parameters are common (see Chapter 3). Different organisational units affected by the schedule may have different and conflicting goals. Goals often go beyond the limited set of overtly economic objectives (e.g., minimisation of operating stresses), and may be ill defined (e.g., maximisation of customer satisfaction). Scheduling methods either need to reflect explicitly the uncertain nature of the available information or give some guarantee as to the insensitivity of the schedule to future information.

6.2 Cognitive Work Analysis Applied to Discrete Manufacture

The field study in Chapter 5 was posited on the systems-thinking context advanced in Chapter 4, in which schedulers are perceived to make decisions that are rational and goal-directed. Production schedulers are perceived to be practical persons who understand the capabilities of machines and work practices.

Over thirty years ago, Dutton (1964) used behavioural theory to explain how humans construct schedules. He viewed scheduling behaviour as the application of goals and search procedures to the sequential discovery and evaluation of alternatives. He saw behaviour as goal-directed and rule-ordered and a function of the situation in which the subjects find themselves. Rules were seen to change and to have logical and emotional antecedents. The sequential nature of search, perception, evaluation and revision he proposed is in sharp contrast to theories of behaviour that presume schedulers find a complete set of alternatives from which they select the best possible course of action. In modelling mental skill in the scheduling of production at a bakery, Beishon (1974) also realised that the activities of the person scheduling production were reliant on deep knowledge112 and were goal-directed. Dutton (1962) highlighted the difficulty of adequately modelling problem solving for business decisions. Descriptive models can be rich, full of meaning and nuance, but difficult to verify and to communicate findings at a level of general abstraction. In contrast, Dutton contends that analytical models often do not examine closely the internal properties of the problem-solving system. The aim of this research is to find a formal language for describing human decision-making processes that bridges the gap between descriptive and analytic models. A formalised description requires a systems-oriented method of analysis that encompasses both the engineering system and the problem-solving operations of the human decision-maker. In forming structural models of scheduling activities on which a human-computer scheduling system can be designed, it is not necessary to detail the actual mental processes engaged by schedulers. The scheduling strategies employed by the single scheduler in the field study, while highly specific to the domain, are at a general level congruent with the findings of McKay, Safayeni and Buzacott (c1992) of schedulers in the field. They found that the decision-making processes of practising schedulers constantly evolve, even during the scheduling horizon. Schedulers monitor situations for exceptions and indicators suggesting adjustments to decisions or decision-making processes. Constraints, objectives, and decision heuristics adapt as the manufacturing system changes. They also

112. Deep knowledge is discussed in Chapter 3.


make extensive use of non-manufacturing data for extended and unconventional practices. As their findings and the Melamed study are clearly comparable, the Melamed study is not anomalous. Certainly then, the Melamed study can be used as a footing for formation of models that describe how schedulers, working under perplexity (see Chapter 3), interpret signals from the environment and work out appropriate actions. Finding a means of describing the scheduling activity that schedulers apply is the objective of this research, not discovering generalised rules or goals. The vocabulary used by McKay, Safayeni and Buzacott (c1992) to describe scheduling domains and activities corresponds to the terminology of Cognitive Work Analysis (CWA) discussed in Chapter 4. There have been attempts by researchers to apply components of CWA to scheduling. Sanderson's Model Human Scheduler (MHS), which is summarised in Chapter 4, is an activity analysis (AA) for manual scheduling. Other researchers have attempted to apply Work Domain Analysis (WDA) to scheduling. The AA is an event-dependent description of the scheduling process that takes place within the event-independent constraints of the 'ecology' of the work domain. The WDA describes the operational constraints of the physical system. The WDA and AA are linked, but previous scheduling studies only consider one or the other. In WDA the means-ends abstraction hierarchy is a framework for describing the functional landscape in which behaviour, described using AA, takes place in a goal-relevant manner (Vicente, 1990).


Figure 64. State variables associated with a heat exchanger at different levels of abstraction.

In the discussion on supervisory control in Chapter 2, activities associated with maintaining performance in the processing of a batch were shown to be distinctly different to the activities associated with the changeover of batches. The function of supervisory controllers is different for each phase. During the processing of a batch, the goal is to achieve steady-state operation of the system. The intentionality of the automated system is embodied in its design. Through the designer's exploitation of the invariants of physical laws, a system is designed for a specific functional purpose. For example, in the WDA for the heat exchanger — discussed in Chapter 4 and replicated in Figure 64 — the thermodynamic equations for convective and conductive heat transfer are used to specify the surface area for heat transfer and the cross-sectional area of the tube bundle. Given the convective and conductive heat-transfer coefficients of the fluids and materials in the heat exchanger, dimensional parameters are chosen so that under normal operating conditions a specified amount of heat is transferred from the primary to the secondary circuit. In other words, the constraints on the purpose-related function were set in the design to meet certain heat-transfer criteria at the level of priorities/values that ensure that the functional purpose "to maintain efficient heat transfer" is upheld. The invariants project from one level to another.
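A representative (and simplified) form of such a design relation is the overall rate equation

    Q = U · A · ΔT_lm

where Q is the heat transferred from the primary to the secondary circuit, U the overall heat-transfer coefficient (combining the convective and conductive coefficients of the fluids and tube material), A the heat-transfer surface area of the tube bundle and ΔT_lm the log-mean temperature difference between the circuits. Fixing the required Q at the priority/values level, together with the fluid properties that determine U, constrains the admissible surface area, and hence the tube-bundle geometry, at the physical-device level; this is the sense in which the invariants project from one level to another.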


The constraints placed on heat transfer map to constraints on the physical functions and thereupon on the properties of the physical devices. At changeover, activities associated with planning predominate.113 Scheduling is the most critical planning task. In scheduling, the intentionality derives from the scheduler's purposes and values, which may include various subjective preferences of the scheduler and the organisation. The scheduler organises work to satisfice performance criteria that may be neither wholly quantitative nor completely specified. The AH in a WDA for production scheduling describes the manufacturing system as a set of resources organised to achieve defined goals; the resources are the means for achieving specified goals (Lind, 1988). In a WDA for Melamed, the ends-means relationships shown in Figure 63 form the physical function and physical device levels. The various devices linked to a particular function represent feasible alternatives: the 'requisite variety' that can be connected to serve all the relevant work situations (Rasmussen, 1998b). Which connection is instantiated depends upon the detailed situational conditions and the scheduler's immediate subjective preferences. Decision making by the scheduler implies the consideration of potential alternative means-ends configurations (Rasmussen, 1988). The scheduler chooses a suitable set of relations between his/her actual intent and the resources available. If more than a single operation were represented in the WDA, for example, printing and collating, then there would be topological links between feasible machines and collators. Krosner, Mitchell and Govindaraj (1989) attempted to apply the Abstraction Hierarchy representation to the scheduling of discrete manufacture without success. However, they misunderstood the nature of the AH. Kinsley (1994) asserts they wrongly believed that the AH represents the network of possible control activities and their effects — "things the controllers do." Instead, the AH in WDA represents the domain constraints governing system behaviour. In Rasmussen's abstraction hierarchy, 'functions' refer to system functions as the

113. See Chapter 2 for the range of planning activities.


means-end hierarchy represents the functional structure of a system (i.e., the constraints on achieving the system's objectives). They also mistakenly thought, "abstract functions decompose into generalised functions." Instead, the move between levels represents shifts in the vocabulary necessary to form a chain of potential relationships between the system's functional purpose and the resources in the system (Rasmussen, 1998b). While it is obvious from their paper that they confuse activities and domain constraints, this criticism is too harsh. The devices and functions they list under physical form, physical function and generalised function in Table 26 are indeed system functions. They meet the requirement of WDA: from any level, the levels above and below specify 'why' and 'how,' respectively (Vicente and Rasmussen, 1990). For the 'insertion' at the physical function level, a particular workstation provides the 'how'. An explicit manufacturing step, for example, 'join,' would specify 'why' the function insertion is needed.

Table 26. Abstraction Hierarchy from top-down and bottom-up perspectives (Krosner, Mitchell and Govindaraj, 1989).

Top-down                                        Abstraction Level       Bottom-up
Minimise costs, Meet production goals           Functional Purpose
Make parts, Reduce deviation from target        Abstract Function       ??114
  dates, meet production schedule
??                                              Generalised Function    Move a part, Perform a manufacturing step
                                                Physical Function       DIP/SIP/MOD insertion, Part transportation via AGV
                                                Physical Form           Location and state of each workstation, part,
                                                                          and transportation resource

For low levels of abstraction, Krosner, Mitchell and Govindaraj (1989) could describe their system by working up from the level of physical form. Links

114. These question marks are shown in the original and signify that it is unclear how to change to this level of abstraction from the previous level.


between levels show the constraints placed on the ends. Similarly, for high levels of abstraction they could work down from the level of functional purpose. Top-down links between levels are teleological: they express the means to meet purpose. They came across the problem that Sheridan (1988) highlighted: teleology "runs out of gas part way down and physical constraints run out part way up." Krosner, Mitchell and Govindaraj (1989) could not make links between generalised functions and abstract functions. At the intermediate levels, disjunction between teleological and constraint-based descriptions denotes conceptual gaps in the representation due to an incomplete description of the system. Kinsley (1994) developed a WDA for the scheduling of an Advanced Manufacturing System (AMS): a fully automated system of machines, which processed discrete parts, and carts for transporting the parts between machines (see Figure 65). The different levels of the WDA show the 'hard' technical constraints — the capabilities of each machine — forming the system boundaries that cannot be violated in the generation of feasible schedules. For example, the generalised function, cleaning, is performed by the physical function, washing, which is afforded by the washing station. If a part requires cleaning, the scheduler must include the washing station in the route. Significantly, there are no means-ends links between the function 'scheduling' at the generalised level and any physical function. Kinsley acknowledges that it is somewhat different from other functions. As the scheduling function may change other elements without any input from an external controller, she sees the system as partially self-organising at the generalised function level (Kinsley, Sharit and Vicente, 1994). At the abstract function level there are functions of mass, time, monetary value and job priority. Each kind of topology represents one aspect of the system goal: minimising time increases timeliness; maximising value increases productivity. Since the same physical apparatus implements all abstract topologies, they are highly interdependent. The priority topology is quite different from the other measures as it includes sinks and sources that do not correspond to system components, but are instead the result of decisions made by a supervisory controller. Because changes in priority arise from external decisions and not from constraints within the system, Kinsley found it impossible to represent factors that


control priorities as part of WDA. Notwithstanding Kinsley’s work being a preliminary expression of a WDA for which she makes no claim of completeness, it highlights the difficulties in locating the scheduling function and in forming means-ends links.


Figure 65. Kinsley’s (1994) Abstraction Hierarchy for an Advanced Manufacturing System.


As each job has to be scheduled, the scheduling function is clearly a crucial aspect of production control. If scheduling were performed manually, instead of automatically, one would expect that the human scheduler would be shown as a physical device in the WDA. The person would be linked to the scheduling function at the generalised level (i.e., the purpose-related level) through a means-ends chain. For many manual systems, treating humans as physical devices is not problematic. In a WDA describing the flow of goods in a supermarket, cashiers may be regarded as physical devices. The cashier enters the prices of goods, collects money from the customer, packs the goods and gives the customer the change. The cognitive aspects of the cashier's decision-making process do not need to be expressed in the WDA. However, to move tags on a machine loading board, schedulers must operate at a high level of abstraction. Their moves are constrained by the functional purpose of each machine. In using the board, schedulers are mindful of complex interacting goals and use various heuristics to try to satisfy them. Recognising that scheduling is an activity of an agent — either human or computer — resolves the problem. Consequently, scheduling is the subject of Activity Analysis (AA) and not of WDA. This leaves another question to be resolved: where and how should the job itself, the key focus of scheduling, be represented? In Chapter 3, a job was depicted as a set of attributes that describes the final product. These informational constraints are transformed by a manufacturing system into a tangible product required by the customer. In other words, the job specifies the purpose of the system as constraints on its operation. For any particular job, the purpose-related level consists of the specification for the manufactured components. By meeting the set of constraints on the physical parameters (material, geometry, batch size, etc.), the product will be produced as required by the customer. The purpose-related function changes with the change in jobs. Hence, WDA in production scheduling focuses on the reconfiguration of the system to meet the purpose-related function at the changeover of jobs (see Chapter 2).


6.2.1 CWA Applied to the Field Study


Figure 66. Work Domain Analysis for Scheduling the Akira presses at Melamed.

Table 27. Legend for the abbreviations for the job attributes.

Jobno              job number
Cust               customer
dd                 due date
D                  depth of form
W                  width of form
Q                  quantity
Parts              number of parts
SbS                print side by side
FOBx               colour x on front of bill
BOBx               colour x on back of bill
To plates          date sent to plate manufacture
Plates available   yes/no
Paper              paper type
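As a concrete rendering of this legend, the specified attributes of a job could be held in a simple record. The sketch below is illustrative only; the field types and the treatment of the colour attributes as lists are assumptions, not a description of Melamed's paperwork or of any system discussed later.

    # Illustrative record of a job's specified attributes (cf. Table 27).
    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class JobSpecification:
        jobno: int                     # job number
        cust: str                      # customer
        dd: date                       # due date
        depth: float                   # depth of form (D)
        width: float                   # width of form (W)
        qty: int                       # quantity (Q)
        parts: int                     # number of parts
        sbs: bool                      # print side by side
        fob: List[str]                 # colours on front of bill (FOB1 ... FOB4)
        bob: List[str]                 # colours on back of bill (BOB1 ... BOB4)
        to_plates: Optional[date]      # date sent to plate manufacture
        plates_available: bool
        paper: str                     # paper type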

The ends-means relationships between physical functions and physical resources in Figure 63 form the levels of physical function and physical device of an AH for the WDA of production control at Melamed (Figure 66). To reduce the


complexity of the scheduling problem for the sake of manageable discussion, only the scheduling of the Akira presses is considered. Yet the scheduler could not ignore how decisions regarding the presses affected the status of other machines. In planning work on a press, Neil considered the impact of the schedule on the workflow at machines used for subsequent operations. The functional purpose of production control at Melamed was to "maximise the long-term financial return" of the company. To meet this purpose, manufacture was organised to "maximise short-term financial viability" and "maximise repeat custom." These are shown at the priority/values level of abstraction.


Figure 67. Feasible means-ends links for a particular job specification.115

Processing paper-based products to the requirements specified by the customer (i.e., the job attributes) is the purpose-related function of the job shop. The

115. At the physical-function level, the four nodes could have been reduced to one, with links to each physical device. Representing the work domain in this way would, however, produce clutter and compromise readability.


attributes set the constraints on production for a particular job; the abbreviations for the job attributes shown in Figure 66 are given in Table 27. For each job, links from the set of constraints, that is, the specified attributes, to particular physical functions represent the feasible alternatives. All constraints on the purpose-related function that may map to constraints on the physical function must map to a single aggregated node for a link to exist.116 For example, if a job requires four colours, there must be four links from the front-of-bill (FOB) and back-of-bill (BOB) nodes at the purpose-related level to four-colour nodes within the aggregated physical function 'print,' which in turn map to four separate applicators within the 'ink' device, as shown in Figure 67. To produce a four-colour job, links are possible only to Akira 2 and Akira 3, as the other two presses have insufficient applicators. The hard technical constraints of the machines, which are causal constraints, project up to the purpose-related function level. Means-ends links only form where the causal constraints match the requirements of the purpose-related intentional constraints. For instance, the number of colours required for the job must be within the constraint boundary of the number of colours that the press is capable of producing. As the physical system can be 'redesigned' by changing the set-up of the machines, intentional constraints at the purpose-related level set constraints on the physical device. For instance, a press's cylinder size is changed to meet the constraints set by the depth of paper required by the job.
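The matching of intentional to causal constraints described here is, in effect, a feasibility test. The sketch below is illustrative: the applicator counts are assumed figures chosen to reproduce the four-colour example, not the actual capabilities of the Melamed presses, and attributes that can be met by re-setting a press (such as cylinder size) are deliberately left to the set-up calculation rather than the feasibility test.

    # Illustrative feasibility test behind a means-ends link: a link from a job
    # to a press exists only where the press's causal constraints (here, the
    # number of colour applicators) cover the job's specified attributes.
    PRESS_APPLICATORS = {"AKIRA1": 2, "AKIRA2": 4, "AKIRA3": 4, "AKIRA4": 2}   # assumed figures

    def feasible_presses(job):
        needed = len(job["fob"]) + len(job["bob"])      # colours on front and back of bill
        return [press for press, applicators in PRESS_APPLICATORS.items()
                if needed <= applicators]

    four_colour_job = {"fob": ["black", "red", "blue", "green"], "bob": []}
    print(feasible_presses(four_colour_job))            # ['AKIRA2', 'AKIRA3']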

116. The temporal constraints (e.g., due date, date sent to plates and plates available) in the specification of the job, which are shown on the job tags, are not physical constraints and consequently do not map between levels.


Figure 68. Structural sequence of Abstraction Hierarchies.

As the configuration of the physical device must meet the purpose-related end, the AH is redesigned for each job as the purpose-related function changes. The AHs for different jobs can therefore be considered metaphorically as a batch of cards. The cards are associated with the two forms of supervisory control defined in Chapter 2: processing of a batch and changeover of batches. Each card shows the potential configurations for the processing of the batch defined by the job. Moving from one card to another denotes the changeover of batches. By reconfiguring the system in a goal-directed way, the scheduler takes on the role of system designer (Rasmussen and Pejtersen, 1995). In effect, the scheduler decides the structural sequence of jobs by ordering the cards (Figure 68). A particular ordering produces specific values of constraints.117 If in changing from one job to another, the machine’s configuration must change, then the values of the

117. The temporal constraints (e.g., due date, date sent to plates and plates available) in the specification of the job, which are part of the specified attributes shown at the level of purpose-related function, influence where the job is positioned in the batch of cards. For example, a job's lateness, which is a performance metric, depends upon the ordering, as the start time is associated with the card's position.


constraints change between cards. 'Shuffling' cards is an activity that can be represented by the decision ladder. The order is instantiated, just as a means-ends link is instantiated in an AH.
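The 'shuffling' of cards can be read as the evaluation of candidate orderings: each card is juxtaposed with its predecessor and the configuration changes implied by the changeover are accumulated. A minimal sketch, reusing the illustrative setup_time function from Section 6.1:

    # Illustrative evaluation of one candidate structural sequence of AH 'cards':
    # the changeover cost between consecutive cards is the set-up implied by the
    # differences in their state variables.
    def sequence_changeover_cost(ordered_jobs, setup_time):
        return sum(setup_time(prev, nxt)
                   for prev, nxt in zip(ordered_jobs, ordered_jobs[1:]))

The scheduler, of course, does not minimise such a sum exhaustively; the ordering is adjusted opportunistically while due dates and the other goals are kept in view.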


Figure 69. The means-ends chain that is instantiated is shown by the links at a composite level of granularity.

To model the activities of the scheduler associated with a particular job requires a series of decision ladders. One set of activities relates to the scheduler merely focusing on the job. By looking at the job's attributes, the scheduler considers the possible scheduling choices. He has to understand the structural means-ends relations between the job attributes and constraints associated with the different physical devices. He considers the various means for achieving his desired ends: represented formally by an AH for the job. If the job was already allocated to a machine, then one ends-means chain is instantiated while other links show feasible alternatives. Another set of activities relates to the scheduler allocating the job to a machine. If the job is currently unallocated then a means-ends chain is instantiated. Figure 69 shows an instantiation that could be created by the scheduler for the four-colour example against the background of all feasible chains. However, if it was already allocated, then reallocation changes the chain that is instantiated. A third set of activities consists of scanning and shuffling. When AH

‘cards’ are juxtaposed, the state variables required for a job are compared to those of its predecessor. If the set up of the machine has to change, then the scheduler has to map these changes of state to value measures associated with the goal structure. The AH is a formal descriptor for WDA that provides a generic framework for describing goal-oriented systems (Vicente and Rasmussen, 1990). Similarly, Rasmussen’s decision ladder seemingly provides researchers with a formal descriptor for representing each set of activities in AA. In identifying the constraints on what needs to be done, the decision ladder is a template of generic information processing activity (Vicente, 1999).118 Sanderson’s MHS (Model Human Scheduler), discussed in Chapter 4, represents information processing activity for scheduling at the level of a generic framework. In the MHS, each path consists of a template of structural relationships that avoids the details of a specific scheduling behaviour.

118. Vicente categorically states that it is neither a model of actual human information processing nor a template of actual human processing activity.


Figure 70. Model Human Scheduler (Sanderson, 1991).

Can the descriptive account of Neil's activities, presented in Chapter 5, be recast so that it fits the formal representation of the MHS? From the MHS, one would expect that Neil's behaviour would sometimes be skill-based: he would immediately act on a signal from the environment without conscious control (path 1 shown in Figure 70). Though not observed during the field study, a plausible example of skill-based behaviour would be Neil expediting a late job immediately after receiving a telephone call from an irate customer. To follow this pathway, Neil must be able to readily find the particular job for the customer on the machine loading board. To quickly pick it up, Neil may scan for both the customer name and the job number, which uniquely identifies it. To find a press that can expedite the job without delay, he has to be able to observe the current configuration of each machine. His recognition of a suitable machine can be formally described as a comparison of the AH for each currently loaded job and the AH for the job to be expedited. Assuming that the urgent nature of the job


calls for the current job to be pre-empted, then the preferred machine would require the smallest time to set it to the desired configuration. In effect, the system is reconfigured from the current AH to the desired AH. In familiar situations, Neil’s behaviour was rule-based as he applied scheduling heuristics. By scanning job attributes listed on the job tags, Neil observed both the patterns across jobs and the system constraints (the current machine configurations). From the patterns in the data, normally he clearly knew which heuristic to apply; either path 7a or 7b in the MHS is followed. For example, he has used the procedures for reducing the set-up time so often that he can carry out the steps without reflection. When the patterns in the data were such that it was not immediately clear which policy he should follow, Neil had to take some effort in assessing the significance of the data. For example from the data, Neil may be unsure whether there would be greater advantage in grouping jobs for the same customer, so allowing a single delivery, or, complying with a scheduling policy that minimises average tardiness. Even if he decided to aim for minimal tardiness, it may not have been clear how to do so. Should he seek it through the minimisation of the total set-up time or by placing jobs in order of increasing processing time? To answer these questions, he would need to further analyse the state of the system. By scanning the job tags he would seek dominant patterns for making inferences; formalistically this is described as the shuffling of AH cards. If it is not clear which performance criteria he is trying to meet, Neil may have to contemplate what the criterion should be, and then define a heuristic that meets it. During unfamiliar situations for which no rules are available, the control would move to a higher knowledge-based level where goals direct decision-making. Neil would form goals by analysing the needs of the environment and his personal aims. From his perception of the relational structure of the causal environment and the work content, he would examine different plans, testing their effects against the goals. Such a situation arose on his first engagement with the single cylinder that produces 13 inch (330 mm) forms: I’ve one printing cylinder. The damn thing’s a mistake {Neil suspects that it was supplied in error when the company was purchasing cylinders}. It’s


a 13 inch deep form. I’ve got to do cross perforations on a machine that is really not designed for perforating…. I improvised before and run jobs, now [the manager] expects that I can do it. [To produce more than a single colour] you run it through the Akiras once, the Trident once, twice, three times.

Not every strategy that Neil or any other scheduler follows, or could follow, has to be included in the AA. Vicente (1999) differentiates between AA, what he calls control task analysis, and strategy analysis. While control task analysis is the determination of what needs to be done (the product of decisions), strategy analysis considers how it can be done (the decision process). The information requirements for each information processing activity, and not the actual location of each shortcut, are critical for understanding what the scheduler does, as they set the interaction between the information 'out there' and the person.119 Whether each shortcut in the MHS can be applied to Neil's behaviour, or whether the MHS does not include some of his shortcuts, will not be analysed, as these details are not central to the thesis.120 While the applicability of some shortcuts is clearly apparent, others can be deduced from the discussion in Chapter 4 or from Sanderson's (1991) explication. The machine loading board brings visibility to some parameters (namely the job attributes, e.g. width and depth, colours, customers and the due date) relating to the 'states of knowledge' nodes in the decision ladder that are required for the subsequent information processing activity. In other words, it provides cues and 'memory' for decision-making, as explained by the 'situation theory' that was discussed in Chapter 4. External memory substitutes visibility of external

119. Reason (1990) discusses the trade-offs regarding efficiency and error between knowledge-in-the-world (KIW) and knowledge-in-the-head (KIH).

120. This does not mean that shortcuts are unimportant; indeed, the particular insight of the decision ladder is its capability of representing the many situations where operators shortcut steps in a decision sequence (Sanderson and Harwood, 1988).


representations for memory storage, and therefore visual inferences replace mental operations (Lewis, 1997). In making decisions, various scenarios may have to be played out. When scanning job tags, Neil may have a primary intention in mind, for example, extend the queue of jobs for a particular press. In endeavouring to carry out this objective he aims, for argument sake, to minimise time wasted in setting up the press. Yet, in striving to follow his rules for reducing set-up time, he has to also explore a range of other influences. He might, for instance, find that the due-date for some jobs would not be met if he applied the set-up rules to all the jobs that he added. Other factors may compound the predicament, for instance, in adding a string of jobs that require the same cylinder, constraints relating to the instant-processing requirement for premium jobs may be violated, if the schedule already shows changes to the configuration of cylinder sizes across the presses. He would have to decide whether to strictly follow this soft intentional constraint, on the off chance that unanticipated premium orders would appear within the current schedule horizon. These scenarios accord with Klein’s (1989) Recognition-Primed Decision (RPD) theory, discussed in Chapter 4. In considering potential moves of the job tags he evaluated possible responses serially, reflecting on plausible goals, critical cues, expectancies, and typical actions. He did not merely recognise the signs for one particular rule that he then applied. In scanning the job tags, he mentally simulated all the factors that could come to play: he considered all relevant means-ends relations, which can be described formalistically by AHs. Lind (1991), in his critique of the decision ladder discussed in Chapter 4, highlights a serious shortcoming when there are competing hypotheses and interpretations. In the above analysis of Neil’s behaviour using the MHS, there was no explicit reference to which performance criteria and goals the scheduler was considering when his behaviour was rule-based (e.g., shortcuts 7a and 7b). While the choice between single delivery to a customer and minimisation of tardiness may relate to performance criteria for customer satisfaction, policies that reduce set-up time do not. During the observation and identification cycles, the scheduler reflects on potential goals. In his decision ladder examples, Vicente (1999) makes extensive use of associative leaps from one state of knowledge to


another.121 Being direct associations, they do not require any information processing activity. Adding associative leaps from the observed data and the system state to either the ‘uncertain’ node or ‘performance goal’ in the MHS may resolve this incompleteness.122


Figure 71. The scheduling goal structure.

121. Once associative leaps are included, the structural relationships change so that the ladder is not the only representation of the decision-making pathways. The important factor is that the representation supports each elemental activity in the recognition-action cycles of decision making. The ladder's contribution to the understanding of decision making is to show that decision making does not follow a rigid serial order and that the SRK paths vary with levels of theoretical and experiential expertise.

122. In contrast, the other type of shortcut, a shunt, connects an information processing routine to a state of knowledge (Vicente, 1991; Reason, 1988). If shortcuts are restricted to these two types, then the shortcut from the 'alert' to 'carry out' in the MHS is in error. As the actual locations of the shortcuts are not central to the present argument, this issue will not be explored here.


For the hypothetical case illustrated above, where Neil endeavours to extend the queue of jobs for a particular press, the goals changed as he considered the flow-on consequences of moving job tags. The various goal choices that may be instantiated at the top of the ladder are interrelated through the goal structure developed in Chapter 5 and replicated in Figure 71. Sanderson (1998) uses two decision ladders, arranged in either temporal or logical order, to depict activity that is shared between two agents (Figure 72). Lintern and Naikar (1998) depict extended activity using separate decision ladders for each activity. Figure 73 shows that in flying a light aircraft, the pilot's activity alters as his/her goals change. Similarly, a series of decision ladders can portray Neil's activities as the goals he seeks to satisfy change.


Figure 72. Activity as action sequence related to the work domain (shown in part at top) and activity in decision terms (shown as two decision ladders at bottom). Diagram shows a case where a function is effected by activity that is shared between human and system (Sanderson, 1998)


Figure 73. A Decision Ladder for landing a light aircraft, showing the prototypical activity associated with pre-landing checks (left) and maintenance of the descent profile (right) (Lintern and Naikar, 1998).

The above hypothetical example, in which the length of a job queue is extended, can be used to demonstrate how a series of decision ladders can represent the scheduling activity. Sometime in his ongoing activity, Neil observes that the state of the system is such that there are not enough job tags at a particular press to cover the scheduling horizon. From experience, he associates this state with a target state, to extend the queue length; there is no explicit reference to the underlying ultimate goal, 'no press idle time.' For this target, the task is to add jobs to the queue by executing a procedure that minimises changes to the press set-up. These recognition-action cycles are elements of activity 1, the leftmost decision ladder shown in Figure 74.123 However, as he executes the procedure, he is on the alert for adverse consequences of proposed actions. He therefore leaps to activity 2. In scanning the job tags, he observes various patterns among the job attributes. Among these he may identify jobs that he particularly wants finished by their due date but whose due dates would not be met, because they do not satisfy the current selection criteria. He then resets his target: meet the due dates of these jobs and, as before,

123. The decision ladders are shown in a generic form, without shortcuts, to indicate that at this stage the details regarding each activity are yet to be resolved.


extend the particular machine's queue. He then redefines the task as meeting the cylinder-size constraints but relaxing the width and colour constraints where necessary to include jobs that are to meet their due date. He then leaps to a known procedure and starts to execute it. Yet again, during its execution he is on the alert for further adverse effects of proposed actions. In scanning jobs allocated to other machines, he may, for argument's sake, identify that the current scheduling procedure would violate the configuration for immediate processing of premium jobs. To resolve the relative importance of competing goals, decision activity then moves to the top of the ladder. On figuring out appropriate performance criteria, he may then define a target state, associate a task to meet the target, and then carry out a procedure.
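The switching between goal-directed activities that the series of decision ladders portrays can be caricatured as an event-driven loop. The sketch below is purely schematic: the trigger, goal-selection and policy functions are placeholders standing for the recognition processes described above, not elements of the field data.

    # Illustrative recognition-action loop: each detected trigger activates an
    # ultimate goal, whose policy is executed while scanning continues for
    # further triggers (adverse consequences, new patterns, premium arrivals).
    def scheduling_cycle(detect_triggers, goal_for_trigger, policy_for_goal, state):
        while True:
            triggers = detect_triggers(state)                 # e.g. short queue, due date at risk
            if not triggers:
                return state
            trigger = max(triggers, key=lambda t: t.urgency)  # which sign dominates at this moment
            goal = goal_for_trigger(trigger)                  # ultimate goal -> target state
            policy = policy_for_goal(goal)                    # task and procedure for that goal
            state = policy(state)                             # execute; may expose new triggers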


Figure 74. Scheduling activity as a series of decision ladders.

Normally for Neil’s scheduling activity, the ultimate goal in the decision ladders comes from the bottom level (i.e., goals A) of the goal structure in Figure 71. How various goals at Melamed can be realised was tabulated in Chapter 5. To realise a goal, an objective was set, a scheduling policy was defined and operational steps were executed; these are equivalent to the ultimate goal, target state, task and procedure, respectively, in Rasmussen’s decision ladder. The relationship between a decision ladder, the goal structure and the abstraction hierarchy is shown in Figure 75. As Neil steps through the procedure he scans the

job tags for those that meet the desired constraints. For each job that he scans, he observes whether the feasible means-ends links (Figure 67) in the AH meet the current procedural requirements. From the subset that meets the requirements he selects a job; that is, he instantiates a particular means-ends chain (Figure 69). During the scanning process he sees patterns among job attributes that may trigger the consideration of other goals. In effect, he compares the AHs of many jobs, grouping them in a logical structural sequence (Figure 68). While Figure 74 shows serial switching between activities directed towards different goals, scanning for patterns that may trigger a new activity occurs concurrently with the search for jobs that meet the constraints of the current procedure. Therefore, the left side of the decision ladder (activities dealing with analysis), with an associative leap to the ‘ultimate goal’ (e.g., activity 2 in Figure 74), runs concurrently with the extant activity (e.g., activity 1). The right side of the ladder (planning and execution) is activated only when the scheduler decides to direct activity towards the new goal.


Figure 75. The relationship between the goal structure, decision ladder and abstraction hierarchy.


During the search for patterns the scheduler is drawn, perhaps, to a nascent pattern between some jobs. He may then evaluate its potential by broadening the search. For example, his recognition of a couple of job tags for a particular customer might trigger a search for other jobs for that customer. Depending on the number of jobs for the customer and other patterns among their attributes (e.g., due dates and printing requirements), he may switch to a goal that includes “complete all jobs for a customer concurrently.” Sometimes, to strengthen the cueing signal for a particular informational pattern, Neil spatially arranges the tags to reflect the pattern. Two examples of spatial arrangement are: jobs for a particular customer that are awaiting allocation are grouped together; jobs requiring the same cylinder are grouped together in order of decreasing width.
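How such a cue might be detected can be sketched computationally as below. The grouping threshold and the dictionary field names are assumptions introduced purely for illustration; they are not values or structures taken from the field study.

```python
from collections import defaultdict

def customer_pattern_cue(unallocated_jobs, min_group=3):
    """Group unallocated jobs by customer and flag customers with enough jobs
    to suggest the goal 'complete all jobs for a customer concurrently'.
    Each job is assumed to be a dict with 'customer' and 'due' keys."""
    groups = defaultdict(list)
    for job in unallocated_jobs:
        groups[job["customer"]].append(job)
    cues = {}
    for customer, jobs in groups.items():
        if len(jobs) >= min_group:
            # order the group by due date, mirroring the spatial arrangement
            # of tags on the machine loading board
            cues[customer] = sorted(jobs, key=lambda j: j["due"])
    return cues
```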

6.2.2 Dynamics of Goal Setting and Attainment

An activity analysis for scheduling needs to show the dynamic aspects of goal setting and attainment. There are clearly two phases in moving to a new goal. First, Neil evaluates whether he should direct his activity towards a new goal. If he perceives that the new activity is propitious, he may then enact it. Action Regulation Theory (Handlungsregulationstheorie), developed by German work psychologists, offers a theoretical framework for including these dynamics (Hacker, Volpert, and v. Cranach, 1982; Wulf and Rohde, 1998). It focuses on the representation of activities directed towards serial goals, which are in strings or hierarchies. While action-regulation researchers are interested in the relations of work between persons in an organisation, their conceptual base has congruency with Rasmussen’s theoretical constructs. Like CWA, Action Regulation Theory sees the activity of completing tasks in terms of recognition-action cycles directed towards meeting specific goals (Hacker, 1982, 1994). They have similar theoretical underpinnings; Hacker’s sensorimotor, conceptual and intellectual levels of regulation are equivalent to Rasmussen’s skill-, rule- and knowledge-based behaviour. At least one group of action-regulation theorists, namely Arnold and Roe (1987), has used Rasmussen’s SRK representation for the levels of regulation. While Rasmussen’s decision ladder has activity directed to a goal function at the top of the ladder, Action Regulation Theory uses intermediate subgoals of the primary goal (Hacker, 1986). For a given task there is an initial

situation demanding action. While a rational decision-maker knows his or her overall objectives (goals), the path of action for their accomplishment is not known. Therefore, a subgoal that seems to be nearer to the goal is sought. The problem therefore decomposes into sub-problems, in a way that is more typical of job shop scheduling. Volpert (1982) describes the sub-problem as a cyclic unit (Figure 76) in which the subgoal G is sought through successive transformations, T, of the state of the system. Each transformation moves the problem closer to resolution. The number of transformations may vary. There is a temporal organisation: first the goal is set, and then the series of transformations, from the start to the completing transformation, is determined. Straight arcs represent this generating process. This is a planning process and is equivalent to ‘formulate procedure’ in Rasmussen’s decision ladder. The series of transformations is then performed in the given order, as indicated by the curved arcs: Rasmussen’s ‘execute’ activity. When the last transformation, T4, has been successfully completed, the ascending arc, from T4 to G, closes the cycle.


Figure 76. The cyclic unit (Volpert, 1982).

Where the activity requires sub-goaling, there are multiple cyclic units, each representing a sub-problem. They are arranged in the hierarchical order shown in Figure 77. At the base of the superordinate unit are transformations that are in themselves goals for lower units.

Figure 77. The hierarchical-sequential organisation (Volpert, 1982).

While the representation of the sub-goals is hierarchical, the activities are sequential as shown by the arrows in Figure 77. At the beginning of the activity, only the form of the superordinate unit is determined. For example, if the goal is the maximisation of short-term financial viability, the superordinate unit may have transformations such as keep all presses and collators fully utilised. Each transformation will have a cyclic unit associated with it. The transformation, keep all presses fully utilised, is a goal which has transformations such as plan the production for the one-colour press until all time slots are productively utilised, then plan production for the four-colour press and so on. The transformation, plan the production for the one-colour press, is a goal that will have its own transformations. These may be: scan the available jobs for jobs requiring only one colour; select jobs from this set to be allocated to the press for the current time horizon; then arrange the order of their production.124 Likewise, arranging the

124 An example from daily life may further clarify the process. If the goal is to purchase some food at the supermarket, the superordinate unit may have transformations such as leave the house, drive to the supermarket, and buy the goods. Each transformation will have a cyclic unit associated with it. The transformation, buy the goods, is a goal which has transformations such as grab a trolley, walk the aisles, find the items, place the items in the trolley and then process at the checkout. Likewise, the checkout process consists of detailed transformations. In planning the supermarket trip, the person only sketches out the general transformations to meet the primary goal and fills out the steps for the sub-problems at the appropriate time.

order of production consists of detailed transformations. In planning how he would construct a schedule, Neil, in effect, only sketches out the general transformations to meet the primary goal and leaves the filling out of the steps for each sub-problem until the time that it is to become operative.
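The hierarchical-sequential organisation of cyclic units can be sketched as a simple recursive structure. The class below, and the press-planning transformations used to populate it, are illustrative assumptions that follow the plan-then-execute description above; they are not an implementation drawn from Action Regulation Theory or from the field study.

```python
from dataclasses import dataclass, field

@dataclass
class CyclicUnit:
    goal: str
    transformations: list = field(default_factory=list)  # CyclicUnits or plain action strings

    def execute(self, depth=0):
        # Planning has already fixed the order of the transformations (the
        # straight arcs); execution performs them in that order (the curved arcs).
        print("  " * depth + "goal: " + self.goal)
        for t in self.transformations:
            if isinstance(t, CyclicUnit):
                t.execute(depth + 1)               # a transformation that is itself a goal
            else:
                print("  " * (depth + 1) + "do: " + t)

plan = CyclicUnit(
    "maximise short-term financial viability",
    [CyclicUnit("keep all presses fully utilised",
                [CyclicUnit("plan production for the one-colour press",
                            ["scan available jobs for one-colour jobs",
                             "select jobs for the current time horizon",
                             "arrange the order of production"]),
                 "plan production for the four-colour press"]),
     "keep all collators fully utilised"])
plan.execute()
```

Only the top-level transformations are fixed at the outset; a lower unit, such as the one for the one-colour press, would in practice be filled out only when it becomes operative.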


Figure 78. The primary cyclic-unit for minimising the number of set ups on each press.

This model of goal-action cycles applies to the different levels of abstraction in the scheduling goal structure. If, for instance, the shop is heavily loaded, Neil may aim to minimise the number of set ups (goal 2A in the goal structure shown in Figure 71). His first step, for argument’s sake, is to consider meeting the overall goal by seeking to meet it for each press in turn. Straight arcs in the primary cyclic unit shown in Figure 78 signify this plan of successive transformations. In executing the plan, he first focuses on Akira 1 and develops plans (shown by a hierarchy of cyclic units in Figure 79). The plan he devises is to first find all jobs that meet the technical constraints of the press and then select a sub-set of these that will minimise set-up time. To find jobs that meet the technical constraints, he first scans for jobs that meet the ancillary capabilities of the press (e.g., only Akira 3 can undertake ultraviolet fixing). After executing this plan (represented by the curved arcs), he then scans this subset to find which order to process the jobs so that there will be minimal change to the press’s configuration. The plan for this transformation consists of other transformations based on cylinder size, width of the paper and colours to be printed.


Figure 79. The hierarchical-sequential organisation of cyclic units for minimising the number of set ups on each press. The shaded triangles show the cyclic units that have formed when the scheduler’s focus is on Akira 1.
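The ordering transformations at the base of Figure 79 amount to sorting a set of technically feasible jobs so that successive jobs require as little change to the press configuration as possible. The sketch below is one hypothetical rendering of such an ordering: it keeps jobs using the current cylinder first, then groups by cylinder size, orders by decreasing width within a group, and finally groups identical colour sets. The field names are assumptions, and the rule is illustrative rather than a transcription of Neil’s procedure.

```python
def order_for_minimal_setup(jobs, current_cylinder):
    """Order jobs so that the current cylinder is used first and configuration
    changes (cylinder, width, colours) between neighbouring jobs are minimised.
    Each job is assumed to be a dict with 'cylinder', 'width' and 'colours' keys."""
    def setup_key(job):
        return (
            job["cylinder"] != current_cylinder,   # current cylinder size first (2AS1)
            job["cylinder"],                       # then group by cylinder size
            -job["width"],                         # decreasing width within a group
            tuple(sorted(job["colours"])),         # then group identical colour sets
        )
    return sorted(jobs, key=setup_key)
```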

The procedural steps within the decision ladders map to transformations in the cyclic units at the base of the hierarchy in Figure 79. At the apex of each base cyclic unit is the goal for the relevant decision ladder. Each base unit represents an instantiation of a decision ladder. There are links from its goal to relevant goals within the goal structure. For the given example, Neil’s primary goal, presumed to be the minimisation of the number of set-ups, was the goal of the superordinate cyclic unit (the topmost apex). However, minimisation of set-ups is a transformation for goal 1B in the goal structure, “fully utilise all machines.” Similarly, goal 1B is a transformation for goal 1C, “maximise productivity,” and goal 1C is a transformation for goal


1D, which is a priority measure in the WDA. Therefore, potentially Neil could operate three levels above the goal that was his primary objective. If his focus could be shifted to a higher level, he may find other transformations (i.e., subgoals and strategies) that surpass the practice based on minimisation of the number of set-ups.

6.3 Summary

In this chapter, the work domain analysis (WDA) and activity analysis (AA) tools of Cognitive Work Analysis (CWA) were applied to the data from the field study. The currently available CWA tools were found to be inadequate for representing human decision-making processes in discrete-event systems. To describe scheduling behaviour as an act of navigating through a sea of multifarious goals, new tools were developed to extend the current formalisms. The intentionality of the work domain derives from the scheduler’s purposes and values. While the various links in the ends-means relationships in the abstraction hierarchy (AH), which represent the ‘requisite variety,’ are set by the causal constraints, the connection that is instantiated depends upon the detailed situational conditions and the scheduler’s immediate subjective preferences. When a job is allocated to a machine one of the means-ends chains becomes instantiated. Scheduling activity was shown to be the subject of AA and as such is not represented in WDA. The purpose of the system is to process paper to specified requirements. The purpose-related function changes with the job. As the configuration of the physical device must meet the purpose-related end, the AH is redesigned for each job as the purpose-related function changes. Representing the AH for each job as cards, moving from one card to another denotes the changeover of batches. Scheduling activity is represented within WDA as the ‘shuffling’ of cards. A series of decision ladders is required to represent the scheduler’s activities that are associated with a particular job. The machine loading board provides the cues and ‘memory’ for decision-making, by making visible the job attributes required for the information processing activities, which are identified in the decision ladder.

A relationship was shown between the decision ladder, the goal structure and the abstraction hierarchy. The various goal choices that may be instantiated at the top of the ladder are interrelated through the goal structure. A series of decision ladders portrays the scheduler’s activities as the goals change. While the scheduler serially switches between decision ladders directed towards different goals, the left side of the series of decision ladders runs concurrently with the extant activity. The right side of the ladder is activated only when the scheduler decides to direct activity towards the new goal. Just as the AH is a formal descriptor for WDA, which provides a generic framework for describing goal-oriented systems, the decision ladder is a formal descriptor for the activities in AA. It relates the constraints on what needs to be done to a generic representation of the information-processing activity, whether the scheduler’s behaviour is skill-, rule- or knowledge-based. The dynamic aspects of goal setting and attainment can be represented using the cyclic units of Action Regulation Theory. The problem decomposes into sub-problems. For each sub-problem there is a subgoal that is sought through successive transformations of the state of the system. First the goal is set, and then the series of transformations, from the start to the completing transformation, is determined. The series of transformations is then performed in the given order. Multiple cyclic units are arranged in a hierarchical order where the problem consists of sub-problems. While the representation of the sub-goals is hierarchical, the activities are sequential. In this description of how a scheduler plans the construction of a schedule, the scheduler sketches out the general transformations to meet the primary goal and leaves the filling out of the steps for each sub-problem until the time that it is to become operative. The findings from this chapter form the foundation for developing an architecture for an HIPSS in Chapter 7.

Chapter 7 Hybrid Human-Computer Intelligent Scheduling

In the preceding discussion it was shown that scheduling practice was very different from the traditional OR formulation, in which many unrealistic assumptions are made about operating conditions (e.g., the availability of machines, release times of jobs and changes in priorities). Human schedulers, operating in a perplex world of uncertainty and instability, can handle unexpected events. Where all pertinent knowledge is not articulated a priori, they exercise local knowledge and apply inductive logic to unfamiliar situations. The knowledge they use is not restricted to quantitative information (e.g., order size and due dates), but also includes subjective information about customers, operators and management priorities. Furthermore, they cope with information that is incomplete, ambiguous, biased, outdated, and erroneous (McKay, Safayeni and Buzacott, 1988). By readily recognising patterns in the data they can identify for the current environmental context what is, and what is not, essential (Sharit, 1984). If human and computer decision making were combined, the human could handle the unstructured parts of a scheduling problem, while the computer supported the human on the more structured parts of the problem. This chapter therefore begins with a discussion on interactive human-computer scheduling systems. The roles and functions that designers of interactive systems assign to humans and computers are examined. Through a discussion of the inherent weaknesses of the decision-making architecture underlying these systems, an alternative architecture is proposed that locates the scheduler centrally in the decision-making process. Using the findings from the Cognitive Work Analysis (CWA) of scheduling at Melamed undertaken in Chapter 6, the way a hybrid intelligent production

scheduling system (HIPSS) can be designed is then demonstrated. Finally, the methodological contribution of an HIPSS to scheduling practice is delineated.

7.1 Interactive Scheduling

Ferguson and Jones (1969) developed the first interactive human-computer system for job scheduling.125 Their model included a set of operation rules that helped the user to investigate alternative actions. In developing schedules, experimental subjects had to consider short-term operating decisions on capacity (i.e., the use of overtime) and acceptance or rejection of orders with specific promised dates. Most significantly, their research was predicated on the proposition that in manufacturing domains where there is a multiplicity of operating objectives with complex trade-offs between objectives, mathematical optimality of relatively simple criteria is of limited use. They postulated that schedulers acting under these conditions frequently change performance measures and alter the coefficients in criteria that are weighted functions. Jackson and Browne (1989) argue that although interactive scheduling systems will seldom give optimal solutions, they can model a given manufacturing system more closely than a schedule produced automatically, since the user has played an explicit part in its development. Interactive scheduling ranges from the human producing a schedule, with advice from the computer, to the computer producing a schedule with the human providing some correction and adjustment (Sanderson, 1989). A common approach is to use a computer to build alternative schedules using various dispatching rules. The computer first builds a schedule, using standard OR heuristics, or their knowledge-based equivalents, and then displays it as a Gantt chart. Depending upon the system, the human either adjusts the chart or chooses a schedule from alternatives built using different heuristics (Nakamura and Salvendy, 1987; Hwang and Salvendy, 1983, 1988; Tabe and Salvendy, 1988; Tabe, Yamamuro and Salvendy, 1988; Nakamura, Shin, and Salvendy, 1991; Bauer et al., 1991; 1994). Moves that violate hard scheduling constraints (e.g.,

125 Tabe and Salvendy (1988) made this claim.

operation precedence) are normally barred. Frequently, a rebuild-algorithm then reschedules activities that temporally follow these manual changes. The interplay between human and computer continues until a satisfactory schedule evolves. Often future outcomes of a schedule may be viewed using a predictor that steps through forthcoming events (Nakamura, 1990; Nakamura and Salvendy, 1988). While there is conjecture over some of the findings of these studies, there is general agreement that humans make use of predictions about future states (Smith and Crabtree, 1975; Tabe and Salvendy, 1988; Nakamura and Salvendy, 1987). For a comprehensive discussion of these studies see the excellent and detailed review of the literature by Sanderson (1989). The approach of these studies was that of minimising or maximising standard OR performance measures. For example, in the studies by Tabe and Salvendy (1988) and Tabe, Yamamuro, and Salvendy (1988), subjects were asked to minimise maximum and average tardiness and maximise machine utilisation. Interactive system developers and researchers contend that such systems can be truly generic (Jackson and Browne, 1989; Morton and Pentico, 1993). Instead of tailor-made algorithms, the computer can apply standard heuristics. In an example given by Haider, Moodie and Buck (1981), a Gantt chart is first generated by the slack-per-remaining-operation heuristic and then the human modifies the schedule by selectively using SPT to clear jobs with small processing times. Their experimental subjects could also force small idle times within the schedule in anticipation of urgent jobs. Noting that the dispatching heuristic was insensitive to these opportunities, they see human schedulers providing the means for switching opportunistically between policies. To decide which policy to apply, the subjects could observe the moment-by-moment configuration of the system and the currently feasible alternatives. Furthermore, interactive system developers argue that human schedulers can make unrealistic schedules practical by applying domain knowledge, which may be structured or unstructured (see Figure 80).
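The dispatching approach can be illustrated with a short sketch of the two rules mentioned by Haider, Moodie and Buck: slack per remaining operation to build the initial sequence, and SPT to clear jobs with small processing times. The job representation and the selection loop are assumptions introduced for illustration, not the authors’ implementation.

```python
def slack_per_remaining_operation(job, now):
    # (due date - now - remaining processing time) / number of remaining operations
    remaining = sum(op["proc_time"] for op in job["remaining_ops"])
    return (job["due"] - now - remaining) / max(len(job["remaining_ops"]), 1)

def shortest_processing_time(job, now):
    # 'now' is unused but kept so that all rules share the same signature
    return job["remaining_ops"][0]["proc_time"]

def dispatch(queue, now, rule=slack_per_remaining_operation):
    """Pick the next job for a free machine according to the chosen rule."""
    return min(queue, key=lambda job: rule(job, now))
```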

[Figure 80 places the user at the centre of the interactive scheduling task, with structured inputs (the problem, a graphical representation, heuristics and algorithms, a relational database) on one side and unstructured inputs (past experience, intelligence, current knowledge) on the other.]

Figure 80. The two types of inputs in the interactive scheduling task (Bauer et al., 1991).

While Bauer et al. (1991) locate the user in a central position in their task diagram for interactive scheduling (Figure 80), this representation belies the real relationship in their system; the user only amends the schedule that has been automatically generated. In this formulation of interactive scheduling, the human is not the architect of decisions but plays a subsidiary role. Perusal of the literature shows this is the norm. Nakamura and Salvendy (1994) present the case for decision making to be a cooperative venture between human and computer. Their argument rests on the special abilities of humans to consider several factors concurrently and deliberate on multiple goals. In addition, as humans can recognise conflicts between goals and then decide how to resolve them, they can base their decisions on realistic criteria. In Chapter 2, their classification of the different modes of human-computer interactive planning and scheduling was introduced using the diagram replicated in Figure 81.126 In their definition of hybrid intelligent systems, Nakamura and Salvendy state that both the computer and human put forward solutions. While they may claim that there is a partnership between human and computer, in their

126 Although Nakamura and Salvendy refer to both planning and scheduling, to avoid clumsy phrases, the term ‘scheduling’ in the discussion of their work implies both planning and scheduling.

architecture the computer has primacy. For the three different levels of operation that they delineate, humans merely respond to the computer’s decisions. For the most active role delegated to them, humans select and implement one schedule from a set offered by the computer. At the next level, the computer suggests a schedule and carries it out if the human approves. Under the most limiting condition, the human is outside the loop; the computer chooses and implements a schedule and only informs its supposed partner after the fact. Having delineated the possible levels of operation, Nakamura and Salvendy give an example of a decision-aiding system in which the scheduler modifies the schedule by imposing constraints or using different priority rules. A new schedule is then generated until an acceptable one is obtained.

[Figure 81 distinguishes three modes. (A) Algorithm and knowledge-based system: the computer offers a solution and replies to questions (explanation, consultation); the human filters any poor machine solution. (B) Manual system: the human offers and evaluates a solution; the computer displays the results of data processing (look-ahead). (C) Hybrid intelligent system: the human offers, evaluates and modifies a solution; the computer also offers a solution and displays the results of data processing.]

Figure 81. Human-Computer Interactive Modes (Nakamura and Salvendy, 1994).

Nakamura and Salvendy see the human’s primary role not as that of a decision-maker within the decision loop, but as meeting performance objectives by managing knowledge resources. That is, the human functions as a supervisory controller of an automated decision system. Humans are assigned a similar supervisory position in studies at the Georgia Institute of Technology. The series

of studies on the GT-FMS (Georgia Tech Flexible Manufacturing System) are particularly noteworthy as they were the first to use Rasmussen’s decision ladder for activity analysis in human-computer scheduling (Dunkler et al., 1988; Hettenbach et al., 1991; Mitchell et al., 1991). Schedules were automatically constructed using a heuristic127 that combined SPT and EDD. To control the build of the schedule, experimental subjects could either change the weighting factor in the heuristic or expedite jobs directly. They were asked to maximise a profit function, which was a sum of the value of the throughput and penalty costs for tardiness. The objective of the research was to observe the subjects acting as supervisory controllers who use performance history to tune the weighting factor. The researchers were surprised to find that the subjects became involved in building the schedule, instead of leaving the schedule construction to the computer. They found performance history and trend information were not factors in the subjects’ decisions: instead, they preferred to evaluate the effects on tardiness and throughput from the current system status.128 While the subjects indicated that they could have performed better if they had more information on the system’s status, they did not request any more trend information, such as past scoring histories and score-to-date. Although the subjects were asked to maximise profits, they tended to break the problem down into sub-problems, seemingly in concordance with Action Regulation theory described in Chapter 6.129 Their

127 The Weighted Operation Priority Index, WOPI, is a weighted linear combination of SPT and EDD, where the weighting factor, α, is used to set the relative dominance of SPT and EDD. How it was calculated is a mystery, as SPT and EDD are both means of ranking jobs. Perhaps WOPI_j = α × pSPT_j + (1 − α) × pEDD_j, where pSPT_j is the ranked position of job j when the available jobs are SPT ordered and pEDD_j is the ranked position of job j when the available jobs are EDD ordered.

128 They had to estimate starting time, processing time and completion time without the aid of any temporal display such as a Gantt chart.

129 Hettenbach et al. (1991) observed that of the eight subjects, three focussed on avoiding all penalty costs and one endeavoured to minimise the number of tardy jobs. Completing parts (i.e., throughput) was a secondary goal. Unfortunately, the dynamics of changing from one sub-problem to another were not reported.

immediate objective also changed with system constraints. For example, when the “end of session is near” alarm was activated some subjects immediately modified their behaviour. They stopped evaluating late jobs and repeatedly expedited jobs that had operations with short processing times.
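If the interpretation of WOPI guessed at in footnote 127 is taken at face value, the weighted combination can be sketched as follows. This is no more than a rendering of that guess, with assumed field names; the original calculation is not reported in the cited studies.

```python
def wopi_order(jobs, alpha):
    """Order available jobs by a weighted combination of their SPT and EDD
    rank positions (the interpretation suggested in footnote 127)."""
    spt_rank = {id(j): r for r, j in enumerate(sorted(jobs, key=lambda j: j["proc_time"]), 1)}
    edd_rank = {id(j): r for r, j in enumerate(sorted(jobs, key=lambda j: j["due"]), 1)}
    def wopi(j):
        return alpha * spt_rank[id(j)] + (1 - alpha) * edd_rank[id(j)]
    return sorted(jobs, key=wopi)
```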

[Figure 82 shows a blackboard architecture comprising planning, executing, monitoring, formulation, compensation and identification knowledge sources arranged around a blackboard, together with a schedule generator, status-updating and input modules, an event list, a database and interfaces.]

Figure 82. Architecture of Human Performance Model (Nakamura and Salvendy, 1994).

Nakamura and Salvendy’s hybrid intelligent model combines the features of the human, the OR model and the AI model. The components of their decision-aiding system for an FMS (Flexible Manufacturing System) are a human performance model, an interface, a human scheduler, a schedule generator, and an FMS model. Declaring that the human performance model, shown in Figure 82, is similar to Rasmussen’s decision ladder, they claim that it captures the scheduling behaviour of the human scheduler using a variety of knowledge sources that include heuristic algorithms, optimising procedures, and rule-based procedures. However, they provide no evidence that these algorithms and procedures are based on the information processing activity of a human scheduler. The human’s position in the decision-making process of this scheduling system is no different from that in other interactive scheduling systems. Knowledge acquisition and maintenance of the knowledge bases are problematic, as discussed in Chapter 2. When developing their expert system for production


scheduling discussed in Chapter 3, Kerr and Ebsary (1988) found that the knowledge base of the scheduler was continually changing. The inference they draw is that under such circumstances, acquisition of the scheduler’s knowledge must be continual for the system to reach the performance level of a human scheduler. Capturing the knowledge of the domain expert is a process that is fraught with difficulties, especially where knowledge is deep and tacitly held. Consequently, the policies of decision-makers often contain important information that may not be explicitly incorporated in a model (Hoch and Schkade, 1996). Even if continual updating of the knowledge base were possible, automated decision-making may not be practicable, as humans are generally unwilling to relinquish control to a model (Kleinmuntz, 1990; Hoch and Schkade, 1996). The experiments on the GT-FMS support this contention; the subjects did not keep to their assigned role of supervisory controller, but became actively involved in ongoing decision making. Their desire to be active participants in the decision-making process may be due, in part, to Bainbridge’s (1983) contention that persons are disinclined to trust decisions that vary from their own, where they cannot understand the methods and criteria used. There is another compelling argument for maintaining active human decision-makers. Where either the relationship between the predictor and the dependent variable changes or new variables become important, a decision model may lose its predictive power. Humans can provide added predictive value, as experts are better than models at adapting to changing environmental conditions (Blattberg and Hoch, 1990). For the above reasons, Hoch and Schkade point out that it may be dangerous to go on autopilot with a completely automated system. Instead they argue that combining human and computer decision making often outperforms either acting alone. The difficulties with modelling depend upon the degree of uncertainty in the manufacturing environment. In Wiers and McKay’s (1996) two-dimensional typology of workshops, shown in Table 28, the critical type of shop for interactive scheduling is the sociotechnical shop, in which there is substantial uncertainty in information and execution. Under these circumstances the scheduling problem is ill defined. Consequently, embedding, a priori, the necessary flexibility into the

system to identify the problem state and offer an appropriate solution is not possible.

Table 28. A typology of production systems (Wiers and McKay, 1996).

                      No uncertainty    Uncertainty
No human recovery     Smooth shop       Stress shop
Human recovery        Social shop       Sociotechnical shop

To overcome the difficulties of ‘smart’ systems in situations that exhibit uncertainty, the smart features must be implemented in a way that effectively and efficiently complements human tasks (Wiers and McKay, 1996). McKay et al. (1989a) adopted an integrated approach to scheduling, combining OR methods, an AI expert system, constraint-oriented search and a human scheduler. The elements of their system are a knowledge base, an expert system, a schedule generator, a domain manager, a schedule modification system and a human scheduler. Uncertainty data, relaxation rules and ‘goodness’ rules are in the knowledge base. The domain manager contains objectives, constraints, goals and measures of schedule quality. As the scheduler enhances and tunes the schedule, the schedule modification system ‘learns’. The scheduler refines the schedule and provides feedback about the current situation to the schedule modification system. The schedule modification system captures human scheduling behaviour, though the method of knowledge capture is not elucidated in the paper. Aspects of this system that bear on an HIPSS are the identification of patterns and clues, and the provision of supplementary direction for constraint relaxation. Wiers and McKay (1996) argue that an ISS (Intelligent Scheduling System) needs to be appropriately designed to help identify patterns, predict future problems and recommend possible solutions. An interactive scheduling system that could be applied to the type of environment found in the field study at Melamed must work with multiple and conflicting objectives. Belton and Elder (1996) developed an interactive scheduling system that uses a multiple-criteria heuristic to sequence jobs on a bottleneck machine.

They saw the need for an interactive approach in situations where there were conflicting objectives. Their intention was to develop a system in which the scheduler could control schedule performance across multiple measures by manipulating the weights for the parameters in the heuristic. To find a relationship between six weights and ten performance measures, they systematically changed weights and observed the effects on performance. Their search was in vain, a result that should have been anticipated from a cursory browse of the literature on multiple-criteria heuristics applied to scheduling. How can schedulers, operating under ill-defined conditions in real plants, be expected to correlate a balance between weights and a desired equilibrium across performance measures, where rigorous experimental procedure applied to well-defined data could not show how the weights correlate with the performance measures? These findings support the contention made in Chapter 3 that a single criterion in a composite measure obfuscates the effects of scheduling activities on each element of the measure. Their method is the same as that used for supervisory control of the GT-FMS, and therefore shares the same shortcomings relating to active involvement in ongoing decisions. At the 1991 SIGMAN130 workshop, leading AI researchers, who specialised in scheduling, held a roundtable discussion on interactive scheduling (Kempf et al., 1991). There was some consensus with regard to the following benefits. Interactive scheduling:

1. Allows managers to pursue goals and enforce constraints that cannot or have not been given an accurate computational representation or that change rapidly over time;

2. Allows humans to build schedules by methods that they use naturally, but are hard to represent as algorithms;

3. Allows humans to guide searches in directions that they might intuitively follow;

130 The Special Interest Group in Manufacturing of the American Association for Artificial Intelligence.

4. May be used to educate the domain experts about schedules and scheduling methods, and to negotiate with production managers and clients about release dates and due dates.

While developers may anticipate these benefits, they are difficult to attain for well-behaved manufacturing environments, and the likelihood of their realisation decreases as uncertainty increases. A major reason for failure advanced at the workshop concerns acquiring knowledge of the domain constraints and the methods the human schedulers apply. It is difficult to represent the scheduling process in a computer in situations where schedulers do not explicitly apply any clearly defined model of the scheduling process, but instead tacitly apply extensive knowledge that they have accumulated from years of experience. The methods that the scheduler applies may not translate into either expert system rules or data structures and algorithms. A critical difficulty is that the domain expert, a scheduler, who has a quite different background to the knowledge engineer, may not understand the constraint representation that the developers build into the interactive system. The SIGMAN report warns that an interactive scheduling system may not find acceptance on the shop floor if it fails to exactly emulate the behaviour and performance of the domain expert. It also emphasises the failure of systems to adapt to the great variety of circumstances encountered on the shop floor. Noting that many systems that have been successfully deployed contain only a small amount of AI, the report asserts that success depends on many other issues such as the interface design, database connections, real-time data collection, and so on. McKay and Buzacott (1999) state that a good decision-making process can identify decisions that need to be made, understand what the constraints are, and know how they interrelate and how they can be manipulated. It knows the impact of the manipulation of the constraints, it can identify and use appropriate solution methods, and it can recognise when the best solution has been reached. When rescheduling because of an unexpected event, some work should be pushed away from the impact zone if at all possible, batches may be split, and so forth. They point out that in the real world, constraints have an elasticity that can be exploited when reacting to the unforeseen.

McKay (1997) developed a custom-built Production Planning System (PPS) tool that has special functions and interfaces that help schedulers in context-sensitive situations of the type discussed in Chapter 3. Schedulers can enter and adjust information that is not on the main computer system. They can move work between lines at will, change/override every characteristic about the work, and can overload some secondary resources. The only exception is that the hard constraint — only a single job can be on a machine at a time — cannot be violated. To match work expectations, the system allows them to see the impact of a schedule on finite capacity. They can then split jobs, book overtime, etc. as needed to make up any shortfall via the resources that have some elasticity. Schedulers can dynamically adjust objectives without the system creating too tight a plan that would require too much alteration. The system has functions that allow schedulers to align jobs with the next shift, to highlight jobs close to a shift boundary, and to split jobs and to separate the setup from the processing. For humans to be actively involved in decision making, the computer system needs to support their decision-making processes. Humans reflect upon the characteristics, that is, the attributes, of jobs and the calls that these make upon the shop. The job attributes, and patterns among attributes across jobs, act as stimuli. Schedulers, relating job characteristics to the state of the working environment, use their deep knowledge of the domain to draw inferences about possible scheduling strategies. In this reflection schedulers may consider, for example, the following issues raised by McKay et al. (1989a): Which goals currently have high priorities for management? Which jobs need immediate attention? Which jobs may be left until later? Which jobs are likely to cause problems? Which processes are difficult to repeat or set up? What material or quality issues will arise?
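One of the functions described above, splitting a job that would cross a shift boundary while keeping the setup with the first piece, can be sketched as below. The time representation (hours from an arbitrary schedule origin) and the field names are assumptions made for illustration; they are not taken from McKay’s PPS.

```python
def split_at_shift_boundary(job, start, shift_end):
    """Split a job whose work would cross the shift boundary into two pieces,
    keeping the setup with the first piece and only processing in the second.
    Jobs are dicts with 'setup_time' and 'proc_time' in hours (assumed)."""
    setup, proc = job["setup_time"], job["proc_time"]
    if setup + proc <= shift_end - start:
        return [job]                     # fits within this shift; no split needed
    available = shift_end - start - setup
    if available <= 0:
        return [job]                     # not even the setup fits before the boundary
    first = dict(job, proc_time=available, split_part=1)
    second = dict(job, setup_time=0.0, proc_time=proc - available, split_part=2)
    return [first, second]
```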

7.2 Drawing the Threads of the Discussion Together

Interactive scheduling tools are commonly designed so that the computer first builds a schedule, which it then displays to the scheduler as a Gantt chart. Normally the objective is to construct a schedule that minimises or maximises standard OR performance measures. The human can then make minor


improvements to the schedule. Sometimes the system is so restrictive that the scheduler can only select subsets of jobs and then apply a ‘canned’ heuristic. Another approach to interactive scheduling applies to the construction of schedules that satisfy multiple goals. Humans acting as supervisory controllers tune the weights in a multiple-criteria heuristic that the computer uses to schedule jobs. It was found that experimental subjects, not content to merely tune heuristics, became actively involved in ongoing decisions. In addition, they tended to move the schedule towards the set goals by meeting subgoals. Hence, the use of multiple-criteria heuristics to guide schedule construction towards the satisfaction of multiple goals is questionable. Subjects also tended to modify objectives with changing circumstances. In real plants that are bounded by numerous constraints, schedulers apply wide-ranging knowledge, structured and unstructured, to build schedules that satisfy many objectives. If a scheduler, operating under such conditions, were to use an interactive scheduling system that applies OR heuristics to all available jobs, the schedules constructed by the computer may be quite unrealistic. The scheduler would then have to make extensive alterations to obtain schedules that were serviceable. McKay and Wiers (1997) point out that schedulers often have to urgently adjust schedules to solve problems that arise. They do not have the time to correct problems due to infeasible schedules generated by badly flawed algorithms. The use of smart systems that capture the schedulers’ knowledge and policies, however, may not overcome problems with interactive scheduling systems, especially in environments in which the schedulers’ knowledge is either continually changing or difficult to extract. Where there is uncertainty, interactive scheduling systems should help the scheduler identify the current and potential system states by showing patterns and providing clues. To enable schedulers to develop realistic schedules that cope with contextual factors, a scheduling system must allow them to pursue goals and enforce constraints not represented in the system, as well as to relax and modify those constraints that are represented. To provide elasticity and hence robustness, the system should allow them to include idle times at advantageous positions within a sequence.

As schedulers are inclined to mistrust decisions they do not understand, they are reluctant to use a system that does not make sense to them. The lesson learnt from the designers of DSSs (Decision Support Systems), a field in which there have been two decades of experience, is that data need to be transformed into a form that suits users’ purposes. The representations and models need to fit naturally into the user’s individual perspective of the decision-making environment (Kerr, 1991). The emphasis in DSSs is on the semantics of data and the flexible retrieval of data that meets the needs of users, and not on complex mathematical models, which are incomprehensible to the user. The emphasis is on ‘responsibility.’ Ultimate responsibility lies with a person, not a machine. Schedulers will not wholeheartedly follow a schedule unless they believe that it will perform to their satisfaction. If they cannot understand the methods and criteria that the scheduling tool uses in reaching a decision, they are disinclined to rely on that decision if it varies from their own. Sharit, Eberts and Salvendy (1988) warn that for users to maintain responsibility, they must understand the procedures the DSS applies (e.g., algorithms). Not only do users of a scheduling tool need to understand what the computer does, but they should also be able to use methods that they find natural, including those methods that are difficult to represent as algorithms. So as not to restrict their decision-making processes, the tool should allow them to search for feasible schedules in any direction that they might intuitively follow. The critical issue, raised in Chapter 3, is that the locus in solving scheduling problems needs to be shifted from algorithms to the persons who have to take responsibility for the planning of production. While there are many affirmations by developers and researchers in the field that human-computer cooperative decisions are central to interactive scheduling, the practice has been found wanting.

7.3 Hybrid Intelligent Production Scheduling System

From the above discussion, interactive scheduling systems that apply simple heuristics to a simple model of the manufacturing system are clearly unsuitable


where conditions are perplexing. Schedules constructed by them would need such heavy amendment that there is little, if any, advantage in producing them. Another approach to scheduling was advanced in Chapter 3. Instead of heavily relaxing constraints, schedulers construct schedules in perplex environments by seeking patterns in the data while navigating through a sea of constraints. Where necessary, to attain feasible solutions they lightly relax constraints.131 Wiers (1997) characterises constraints by their influence on the sense of urgency. Constraints that endanger attainment of the important goals are critical. To prevent or minimise the violation of these goals, the scheduler wants to be absolutely sure that this work is scheduled in a specific way. Jobs that have tight temporal constraints, for example slack time close to the flowtime, may be scheduled first and the rest of the work scheduled around them. The presence of multiple constraints magnifies perplexity, as the information sources may not be stable. Such jobs may be scheduled for a period in which problems that arise can be attended to with the appropriate level of expertise: for instance, they may not be scheduled for the night shift. The scheduler will give particular attention to jobs that have stochastic constraints. For example, from experience the scheduler may realise that the manufacturing time for a particular type of job may vary considerably because the equipment needs to be frequently readjusted to maintain quality. Schedulers, Wiers contends, will give special attention to this type of work because of the likelihood that it may become critically constrained.

131 As all factors affecting feasibility are taken as constraints, due dates and release times are considered to be constraints. Therefore, when a job becomes tardy, the constraint on its due date is relaxed.
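Wiers’s notion of criticality can be made concrete with a small sketch that flags jobs whose slack is small relative to the work remaining, so that they can be fixed in the schedule first and the rest of the work planned around them. Remaining processing time is used here as a rough stand-in for flowtime, and the ratio threshold and field names are assumptions made only for illustration.

```python
def critical_jobs(jobs, now, slack_ratio=0.2):
    """Flag jobs whose slack (time to the due date minus remaining processing
    time) is small relative to the remaining work."""
    flagged = []
    for job in jobs:
        remaining = sum(op["proc_time"] for op in job["remaining_ops"])
        slack = job["due"] - now - remaining
        if remaining > 0 and slack <= slack_ratio * remaining:
            flagged.append(job)
    return flagged
```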


[Figure 83 names the interface elements of an HIPSS: scheduling rules and a knowledge-based adviser; a Gantt chart showing timing at resources and performance prediction; jobs screens holding an unassigned sequence and, for each machine, a sequence with job attributes; and the human decision-making elements of context setting and pattern recognition.]

Figure 83. Interface elements and their location in a hybrid intelligent production scheduling system (HIPSS).

In situations where schedulers seek to satisfy many goals within perplex environments,132 they need to be at the core of the decision-making process, instead of merely altering schedules built by a computer. A distinction can be drawn between other interactive systems and a hybrid intelligent system in which decision-making is collaborative. An HIPSS is designed so that the human expert and the computer interact during the build of a schedule. By participating in schedule construction, schedulers are not merely relegated to a reactive role in the decision-making process. Unlike other interactive systems, an HIPSS allows users a broad degree of action. It preserves the schedulers’ initiative to evaluate situations and to make decisions. By placing the human centrally in the decision-making process, an architecture can be developed for an HIPSS, shown in Figure 83, that is not tightly bound to a restrictive perspective of the problem. The focus of the development of an HIPSS should be on what the scheduler is doing with the computer and not what the computer is doing. Schedulers should feel engaged in scheduling activity, and not engaged in managing a computer. This architecture allows schedulers to wield the HIPSS as a tool. A tool perspective gets away from

132 Perplexity is defined and discussed at length in Chapter 3.


what Woods and Roth call the machine-as-prosthesis paradigm, which focuses on the human acting as an intermediary between the computer expert and the environment. To be a proficient user of this cognitive tool, a scheduler has to completely understand its functions and behaviour (Woods and Roth, 1988). In particular, its reactions have to be self-explanatory and adapted to the actual working situation. In interaction it is extremely important that the schedulers can perceive the connection between their own intentions, or actions, and the effects produced by them. This composition allows the scheduler to cope with unanticipated variability, thereby avoiding the brittleness of response discussed in Chapter 3. Proper structural relationships defined by the architecture are insufficient by themselves. The interaction process must cope with the ways that humans address the scheduling problem. The rules applied by the computer need to be apparent to the persons using it. Using their intuition and knowledge, users can then guide the search for a schedule in directions that they would like to follow. In addition, they can recognise aspects in a situation or constraints on a problem solution that deviate from the computer’s assumptions and expectations (Roth and Woods, 1989). While not interfering with the human’s ‘normal’ way of thinking, the computer system should extend the human’s abilities through the application of suitable abstractions (Sage, 1987). For humans to play a coherent and active role in schedule construction, they must have ready access to all the information they use to make decisions. To assist the schedulers’ decision-making activities, the computer should display all related information held in the database, in a form that helps them to visualise abstract relations and to experiment opportunistically with possible scheduling strategies (Woods and Roth, 1988; Sanderson, 1989). Using information that is displayed and other domain knowledge, schedulers can seek patterns in the data on which to draw inferences about possible scheduling strategies. Clearly, a Gantt chart, usually the primary interface in interactive systems, does not show the requisite information. It focuses on manufacturing attributes (i.e., loading, processing and completion times) and only shows partial information about the jobs themselves — commonly, job numbers and customer names, and sometimes the due dates. Indeed, a Gantt chart should be seen as the product of a decision-making process;


it is the plan for running the shop. While its form may well suit the exposition of a plan, it is not suitable for an HIPSS. The difference is communication as against discovery. A Gantt chart’s purpose is to communicate the plan of work to those people who have to use it. The imperative is simplicity (Bertin, 1981). A display for decision making, in contrast, must comprehensively show all the data a scheduler may use to discover relationships. It has to reveal all the relationships formed by the interplay of the data. Therefore, a Gantt chart should not be the principal means for humans and computers to interact.
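The interface elements named in Figure 83 suggest a simple underlying data model for such a display: an unassigned pool and one ordered sequence per machine, with every entry carrying the full set of job attributes rather than only the timing data of a Gantt chart. The sketch below is an illustrative rendering of that idea; the field names are assumptions, and it is not the design of the ProtoHIPSS described later in the chapter.

```python
from dataclasses import dataclass, field

@dataclass
class JobRecord:
    """Full job attributes visible to the scheduler (illustrative fields only)."""
    number: str
    customer: str
    due: str
    cylinder: int
    width: int
    colours: int
    quantity: int

@dataclass
class JobsScreens:
    """An unassigned pool plus an ordered sequence of jobs for each machine."""
    unassigned: list = field(default_factory=list)
    machines: dict = field(default_factory=dict)   # machine name -> list of JobRecord

    def assign(self, job, machine, position=None):
        # Move a job from the unassigned pool into a machine's sequence,
        # keeping all of its attributes visible for later pattern seeking.
        self.unassigned.remove(job)
        sequence = self.machines.setdefault(machine, [])
        sequence.insert(len(sequence) if position is None else position, job)
```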


Figure 84. The relationship between the goal structure, decision ladder and abstraction hierarchy.

An HIPSS that gives schedulers ready access to all the information they use to make decisions must support the relationship between the goal structure, decision ladder and abstraction hierarchy shown in Figure 84, which is developed through a CWA of the schedulers operating within the scheduling domain. The way an HIPSS can be designed to reveal this information can be demonstrated with a


prototype HIPSS (ProtoHIPSS)133, which was developed for the environment at Melamed. A discussion on the relationship between the ProtoHIPSS and CWA requires some understanding of the features of the ProtoHIPSS and the rationale for their form. The system therefore is described first and then related to CWA: yet, in designing an HIPSS, CWA should inform the design.

7.3.1 Features in the ProtoHIPSS

If the ProtoHIPSS were to be used by Neil, it should allow him to follow practices that produce schedules that are, at least, acceptable, especially since there is no other singularly successful scheduling technique. It must support his use of job attributes. The machine-loading board provides a good starting point for the design of the ProtoHIPSS. It records the order that jobs will be loaded on the presses, and it allows Neil to observe job attributes. An ethnographic design methodology, in which Neil’s usage of the machine loading board is replicated in the new system without question, is not being put forward here. A new system should discourage practices that are obviously detrimental to the production of a good schedule, while encouraging new proficient practices. Nevertheless, a

133

As a prototype, it embodies the essential features of an HIPSS. The prefix, proto, also conveys the meaning that this is the earliest primitive form of an HIPSS and therefore should not be considered a fully developed implementation. The ProtoHIPSS is a fully functional KappaPC application consisting of 650 kbyte (about 15,000 lines) of source code. KappaPC is a high-level object-oriented modelling system, developed by IntelliCorp, which includes an inference engine and object classes for the Microsoft Windows widgets. Being a research tool developed without any funding, it does not meet commercial standards of quality and stability. After viewing a demonstration of the application, Neil and the general manager petitioned more than once for its implementation, despite being advised of its development status. There has been some commercial interest in developing an HIPSS; this will be explored in the near future.


system interface that feels somehow familiar avoids a major hurdle, user resistance. The value offered by the technology has to be great enough for schedulers to reconfigure themselves and their work so they can take advantage of it (Miller, Sullivan and Tyler, 1991). It should extend users so they may restructure their view of the problem. To enhance Neil’s ability to cope with unanticipated variability, the system has to be flexible (Woods and Roth, 1988). The aim therefore is to allow schedulers to build upon their current practices. When scheduling manually, Neil places the tags for the new jobs in the plates section of the machine-loading board, which is used as a marshalling area for unallocated jobs as explained in Chapter 5. On observing the current state of the schedule and the characteristics of available jobs he assigns and reassigns jobs. Frequently, in seeing a common pattern in the attributes for some jobs, he may group them in the unallocated space. He may further arrange them in some desired order based on another patterned relationship, before placing them as a group under the desired press heading. Some jobs may be allocated to a machine before the plates have been received. Only critical jobs are so treated; for example, jobs with impending due dates, unusually long processing times, or those that severely disturb press allocation. While assigning currently unallocated jobs to the presses, he looks at the current allocation and ordering of jobs at the presses. He may append a job, or a string of jobs, to the current sequence, or he may slot it between the already allocated tags. As he builds a new sequence for a press, jobs previously allocated may no longer fit the current strategy. They may be shifted back to the unallocated section of the board or be allocated to another press. The ProtoHIPSS was designed to equip the schedulers with the means to access the same values and to take similar actions.
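To make the parallel with the machine-loading board concrete, the following sketch shows one way the underlying structure could be organised: an unallocated pool plus an ordered queue per press, with operations for grouping and moving jobs singly or en bloc. It is a minimal illustration in Python rather than the KappaPC implementation described later, and the class and field names are assumptions made for the example.

    from dataclasses import dataclass, field

    @dataclass
    class Job:
        """Illustrative job record; the attribute names are assumed for the example."""
        job_no: int
        customer: str
        cylinder: int            # cylinder circumference (mm)
        depth: int               # paper depth (mm)
        width: int               # paper width (mm)
        colours: int = 1         # number of colours to be printed
        due_date: str = ""

    @dataclass
    class LoadingBoard:
        """An unallocated pool plus one ordered queue per press, mirroring the board."""
        unallocated: list = field(default_factory=list)
        presses: dict = field(default_factory=lambda: {
            name: [] for name in ("Akira 1", "Akira 2", "Akira 3", "Akira 4")})

        def group(self, predicate):
            """Gather unallocated jobs that share a pattern (e.g., the same cylinder)."""
            return [job for job in self.unallocated if predicate(job)]

        def allocate(self, jobs, press, position=None):
            """Move a job or a string of jobs to a press, appending or slotting them in."""
            for job in jobs:
                self.unallocated.remove(job)
            queue = self.presses[press]
            if position is None:
                queue.extend(jobs)
            else:
                queue[position:position] = jobs

        def pull_back(self, job, press):
            """Return a job that no longer fits the current strategy to the pool."""
            self.presses[press].remove(job)
            self.unallocated.append(job)

A move such as board.allocate(board.group(lambda j: j.cylinder == 279), "Akira 3") then mimics grouping tags by a shared attribute and placing them, as a string, under a press heading.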


Figure 85. Overview of the Shop during schedule construction showing the Job Windows for the four presses and the unallocated jobs. (The screenshot comprises the tiled Jobs Windows and the Gantt Chart Window.)

Two types of display — the Jobs Windows and the Gantt Chart — allow schedulers to interact with the ProtoHIPSS. There is a separate Jobs Window for each machine and one for jobs yet to be allocated to a machine. To find the basis for grouping jobs, schedulers can search for patterns among jobs. Central to the interactive decision making are the Jobs Windows, which display the attributes of the available jobs. While building the schedule, users of the ProtoHIPSS can see the current state of the schedule on the Gantt chart.134 The Gantt chart shows the expected loading, set-up, processing and completion times. Users can horizontally scroll the chart to view off-screen data; by changing the scaling, they can zoom in to obtain greater clarity or zoom out to view the schedule across multiple shifts. In

134

Schedulers can view both processing and set-up times in the Gantt Chart, although it is not obvious in Figure 85 as the colour difference appears only as a slight variation of grey.


the Job Windows, schedulers can see jobs ranked in order of processing. The primary screen allows them to examine the state of the schedule and the attributes for all jobs, assigned and unassigned, as all the windows are tiled as shown in Figure 85. Using the Job Windows, schedulers can choose classificatory strategies for forming groups and then place them into machine queues. In an opportunistic way, they can try various groupings, make amendments and backtrack on previous decisions.135 The formation of groups depends upon patterns the user sees among the signs. Signification emerges as the user observes a complete screen of images. The patterns uncover data structures, from which schedulers can draw inferences (Tukey, 1977; Sibley, 1988; Bertin, 1981). In ordering the jobs within a string, they can apply various policies from a suite of operations research (OR) scheduling heuristics (see Figure 82).136 A knowledge-based adviser warns the user when soft constraints are infringed, but disallows violation of hard constraints.137 They can also find more information associated with any object on the screen, by right clicking it with the mouse. The product of this interactive process is a Gantt chart. The primary function of the Job Windows is to help schedulers seek patterns on which to classify jobs and suitably group them on the presses. Therefore, the

135

‘Undo’ and ‘redo’ features allow schedulers to move freely within the decision space, thereby making it easy for them to compare different scheduling strategies and to experiment with new strategies.

136

The current implementation includes SPT, EDD and R&M for ordering a string of jobs and parallel versions of SPT and R&M for allocating a string of unallocated jobs to the machines.

137

The current implementation has about 700 logical IF-THEN rules within object methods and twelve expert rules under the control of the inference engine. The latter manage the indication of constraint violation associated with width. About 30% of the rules are associated with constraint management and scheduling heuristics, and the rest control the display parameters.


ProtoHIPSS has to bring ‘visibility’ to the classificatory patterns in the attributes. The job tags used on the machine loading board at Melamed depict the job attributes as abstract signs that signify the characteristics of the printed forms (Morris, 1946; Polanyi, 1962).138 At the time a schedule is constructed, the jobs do not have a physical presence; all that schedulers have before them are signs.139 In the case of the machine loading board, the signs are alphanumeric characters on the job tags, which signify the physical properties of the part to be manufactured. The job tag is therefore a symbolic object that denotes the final form of the product.140

Table 29. Legend for the labels in Figure 86.

Label   Description
1       Horizontal line links all the elements of a JSO into a unified image.
2       Horizontal position of the vertical bar signifies cylinder size. The height of the bar signifies paper depth.
3       The cylinder size is the same as label 2, but the depth of paper is a third.
4       Signifies an increase in width between jobs.
5 & 6   The bars (labelled 5) map to the press icons labelled 6, with the current choice indicated by the tallest bar.
7       The length of the thick bar depicts the processing time.
8       The length of the thin bar depicts the set-up time.
9 & 10  The horizontal position of the vertical line indicates which colour is to be printed.
11      The bar touching the baseline signifies that the colour is to be printed on the front of the bill, whereas the position shown for label 9 indicates the colour is to be printed on the back of the bill.

138

The use of abstract signs by schedulers is discussed in Chapter 3. The significance of attributes and patterns among attributes is highly dependent upon the scheduler’s experiential knowledge of the working environment.

139

Whether they perceive the sign — in the Morris (1946) sense — as a signal, sign or symbol — in the Rasmussen (1990) sense — depends on whether they are using skill-, rule- or knowledge-based reasoning, as discussed in Chapter 4.

140

Morris (1946) distinguishes between signification and denotation. For the object to denote, there has to be a denotatum: something that permits the completion of the response-sequences to which the interpreter is disposed.

Figure 86. Jobs Window for Akira 3: Jobs and their characteristics. A partial display of objects representing job attributes.

Figure 87. Job Details observable in the Job Windows.

Like the job tags on the machine loading board, the ProtoHIPSS also uses symbolic objects to signify jobs. Figure 86 shows these objects for a simplified version of the ‘Jobs Window’ for one of the presses.141 A legend for the labels in Figure 86 is presented in Table 29. Each job specification object (JSO) is a composite sign that denotes the characteristics of the final product. A horizontal line (labelled 1 in the figure) links all the elements of a JSO into a unified image characterising the job. Each element in the JSO signifies a particular job attribute (Figure 87 shows the data used to generate Figure 86) and is visually quite distinctive, thereby enabling schedulers to clearly distinguish between elements depicting different attributes.142 So they can discern the value of each attribute, elements are ‘two-dimensional’. For each job, the element that refers to the same attribute has some features in common with the JSOs for all other jobs. These ‘Global’ features allow identification of all elements denoting the same attribute.143 Global features such as shape, colour, size, location, and closure make the elements unmistakably different from each other. As the global features for each element are distinctly different to the surrounding elements, the scheduler can easily identify the element that signifies a particular attribute.144 For each

141

At Melamed, there are more attributes than these on a job tag, and they are also present in the Job Windows. For readability of the figures, and to ensure the reader is not overwhelmed, these additional attributes have not been included.

142

By right clicking on a JSO, a scheduler can see and modify the value of each attribute in a ComboBox (a Microsoft Windows multiple input/output box). A written description of the characteristics of the job and its expected loading time, set-up time, processing times and completion time is also observable in a pop-up window.

143

When a Job Window is in full-window display, a heading above each element of the sign reminds the scheduler of the referent attribute.

144

Experimental evidence suggests that visually similar graphical objects impair identification (Arend, Muthig and Wandmacher, 1987).


element, the scheduler can easily recognise the value of the referent attribute using ‘local’ features within the element that change with the value.

Figure 88. Global and Local Features (the letter ‘S’ built up from small letter ‘H’s).

What is meant by ‘global’ and ‘local’ features is eloquently illustrated by Hoffman (1980) in describing a human face: “the visual form face consists of a head (global level) containing several details such as eyes, mouth, and so on (local level). These local features may in turn be global to even finer details such as the iris, pupil, and so forth.” Note that ‘global’ is not the same as ‘whole.’ A ‘whole’ would comprise both global and local features. Figure 88 shows the letter ‘S’ as a global feature and the letter ‘H’ for the local features. The constituent elements of the global and local letters are horizontal and vertical bars and symmetries (Navon, 1981). Local and global features are clearly separable. The user can recognise a particular element by its global features and then identify the value of the referent attribute from its local features. The global features are obvious despite differences in the local features. In addition, the local features can vary without marring global integrity. For example, the element that signifies the depth of paper, and denotes the cylinder size for its production, is clearly identified in Figure 86, even where the local feature, the position and height of an unbroken vertical line, varies (compare labels 2 and 3).

There are both alphanumeric and graphic elements in a JSO. The particular form of an element depends on the way its referent attribute affects scheduling decisions, and is underpinned by the theoretical issues regarding patterns in data, discussed in Chapter 4. The clear advantage of an alphanumeric form over a graphic form is the unambiguous display of the value of an attribute. Indeed, for customer names and job numbers it is preferable. A job number is a candidate key

that uniquely identifies the job. As it acts as a label, its character is nominal and is not the subject of quantitative comparisons. If, instead, the display for the job number were a graphic, a user would have to decode it to obtain its value. The ability of a user to read the characters clearly and distinctly sets the lower limit on size for alphanumeric displays. Graphic elements, where practicable, are used as they aid inferential processing.145 Users can quickly search for particular graphic signs in a large field of graphic signs (Arend, Muthig and Wandmacher, 1987; Treisman and Souther, 1985). From the emergent features that arise due to patterns in the values of the attributes, schedulers can see interactions and dependencies between jobs.146 Through inference, they can decide which factors are important for the current set of jobs. They can reorder the jobs on the screen to obtain patterns that are more desirable. This involves changing the order within a Job Window and moving jobs to other Job Windows, either singly or en bloc. From the patterns in the values of the attributes, schedulers can classify and group jobs. Observing interactions and dependencies between jobs, they can explore and test different arrangements. What particular attributes, and the patterns in the attributes, signify, and the responses these engender, depend upon experiential knowledge. Graphics have other advantages over alphanumeric characters for displaying attributes. Small graphic objects can still be visually conspicuous. By using small

145

It is easier to make perceptual inferences using distance, size, spatial coincidence, and colour comparisons, than it is to make logical inferences using mental arithmetic and numerical comparison (Casner, 1991; Larkin and Simon, 1987). Higgins (1994a; 1994b; 1996b) discusses these issues in detail.

146

An emergent feature depends on the identity and arrangement of the constituent parts, but is not identifiable with any single part. The arrangement of the parts makes it obvious. The emergent feature brings the parts together into wholes that “pop out” of the display (Pomerantz, 1986; Sanderson, Flach, Buttigieg, and Casey, 1989).


objects to depict attributes, more jobs can reside on a single screen.147 Consequently, schedulers can scan and compare the attributes for a larger set of jobs in the primary screen without scrolling (Figure 85). For signs to be effective they need to: (1) clearly and distinctively show the job attributes; (2) clearly display, unambiguously, the values of the attributes; (3) support the scanning of jobs to locate those jobs having a particular attribute value; (4) clearly display patterns in attributes across jobs. It is extremely important to satisfy these criteria to avoid screens cluttered with poorly designed tiny graphic objects. Graphic signs may have local features that have either substitutive or additive scales (Norman, 1991). In an additive scale, the representations can be ordered, whereas a substitutive scale has each new item replacing the one before it. In the ProtoHIPSS, vertical lines are used for the local features for elements that signify attributes that employ substitutive scales. The horizontal location of these lines denotes value. Why? Decision making in scheduling is heavily reliant on matching jobs with the same attribute values. This representation strongly supports such matching. Schedulers can scan along a vertical line within an elemental part of a JSO. To obtain a match, they would seek vertical alignment. They only have to decide whether a mark is present or not. Higgins (1994a) contends that this process has superiority over scanning vertically for a change in the shape of a component (e.g., Kleiner-Hartigan tree, star symbol plot, and Chernoff faces). Where values affect the overall shape of the graphic, the user would have to discriminate between shapes.148

147

By maximising data density, a greater proportion of the data to be perused by the scheduler — the attributes for all available jobs — can be displayed simultaneously on a screen. Leung and Apperley (1993) formalised this as the maximisation of efficiency. Higgins (1994b) discusses the types and features of different graphic forms in some detail.

148

Goodstein (1981) warns that the graphic has to be chosen so that the syntax of the display does not obscure the data.


Substitutive scales are used to signify the values for a job’s depth, width and colours, and for the press choices and press allocation. These attributes only have discrete values. The depth that a job may take is constrained by the size of the printing cylinder. The circumference of the cylinder has to be an exact multiple of the depth. Five cylinder sizes are available at Melamed. As changing a cylinder takes a long time, the scheduler is primarily interested in cylinder size, not depth. It is preferable to maintain the same values of these attributes between jobs, as a time penalty occurs where there is a difference. Within the elements signifying cylinder size and colour, the values of the attributes map to spatial location. For the printing cylinder, the five cylinder sizes are shown as vertical dashed lines in Figure 86. The horizontal location of the bar shows the size of the cylinder that must be used to produce the job, and the bar’s height expresses the depth as a fraction of the cylinder’s size. For example, the local feature in the cylinder’s sign for job 16742 is an unbroken vertical line (labelled 2) that is a third of the total height. The line third from the left represents a cylinder of 279-mm circumference. Hence, the depth of the job is 93 mm as it is produced on a 279-mm cylinder (the values can be seen in Figure 87). A change in depth while maintaining the cylinder size is but a minor set-up. It requires only a perforating tool to be added or removed.

In the sign representing the press allocation (the press allocation object, PAO), a large vertical line shows the machine to which the job has been allocated (labelled 5). The short vertical lines indicate alternative machines that can produce the number of colours required. The element for the colour attribute has dashed lines that form a rectangle (an emergent property) that acts as a global feature. The local features are thick vertical black lines that are on a substitutive scale. The horizontal location of a bar signifies a specific colour. Colours on the front of the bill (the sheet of paper) are shown lower (labelled 11) than those on the back of the bill (labelled 9). Printing the same colour on the front and back of a bill requires two separate modules, which is depicted by two collinear vertical bars (labelled 10).

A different form supports the scanning of an additive scale. Mapping dimensional size to an attribute’s value is useful for the comparison of continuous variables. In the element of the JSO that signifies manufacturing time, the number and spacing

of the tick marks, representing hours, and the sign’s horizontal location together define the sign’s global features. Its local features are two horizontal bars. The lengths of the thick bar and the thin bar depict the processing time (labelled 7) and set-up time (labelled 8), respectively; the combined length signifies the manufacturing time. An additive scale makes it easy for the scheduler to compare processing times by scanning vertically. For example, the scheduler may seek a job to fill the time that remains before the shift ends, say two hours. To find a suitable job, the scheduler may fixate on the two-hour interval and then scan vertically. Jobs in which the horizontal line terminates to the left can be processed in the required time. These activities do not rely on the observer accurately reading the value of the attribute, but on comparison. The misreading of the relative order of values that are close together does not affect scheduling performance, as the margin of error between predicted and actual values of processing time is greater. This graphic therefore meets the requirements for effectiveness (Higgins, 1994a, 1994b).149 While additive scales generally suit size comparison, for cases requiring an exact numerical value, Arabic notation is clearly superior. For our example, arranging jobs so that width decreases between jobs saves time in setting up. In seeking a job to follow one with a width of 229 mm, the scheduler would scan the width of other jobs. If the scale was additive, the scheduler would find difficulty in deciding whether a width was 228 mm, which would incur no set-up penalty, or 230 mm, which would do so. Therefore, the object displays width in Arabic notation, a substitutive scale. Because the display is numeric, one would expect pattern matching to suffer, as it does not support perceptual inference. A graphic image, next to the numeric signifying the width (labelled 4), helps alleviate this problem. Where the width increases between jobs, while the cylinder size remains the same, there is a time penalty in changing jobs. The perforating tool has to be

149

In discussing effectiveness, Higgins makes particular reference to Mackinlay’s (1986) expressiveness and effectiveness, Leung and Apperley’s (1993) evaluation framework that considers the proportion of the represented data on the display screen, and Tufte’s (1983) removal of screen clutter using the principle of maximising the data-ink while erasing non-data-ink guides.


changed. The graphic draws the scheduler’s attention to those situations in which the width increases. For information regarding a job that is not in the plant’s database, the ProtoHIPSS has the facility to annotate the record with comments. Where such factors are infrequent, a simple visual cue can be used to remind schedulers to consider these notes: ProtoHIPSS uses the presence of a large red dot next to the job number as the cue. When a job that is displaying this dot is being considered in a scheduling strategy, the scheduler may access the reminder notes by clicking on the dot.
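The cylinder element described earlier lends itself to a small worked illustration. The sketch below is an assumption-laden illustration in Python: only the 279 mm circumference and the 93 mm depth come from the worked example in the text, while the list of five cylinder sizes and the pixel dimensions are invented for the sketch.

    # Geometry of the cylinder element of a JSO (illustrative values only).
    CYLINDER_SIZES_MM = (203, 229, 279, 305, 330)   # five sizes are available; values assumed
    ELEMENT_WIDTH_PX = 50
    ELEMENT_HEIGHT_PX = 24

    def cylinder_mark(cylinder_mm, depth_mm):
        """Return (x position, bar height) for the unbroken vertical line.

        The horizontal slot identifies which cylinder must be used; the bar height
        expresses the depth as a fraction of the cylinder circumference.
        """
        if cylinder_mm % depth_mm != 0:
            raise ValueError("cylinder circumference must be an exact multiple of the depth")
        slot = CYLINDER_SIZES_MM.index(cylinder_mm)
        x = round((slot + 0.5) / len(CYLINDER_SIZES_MM) * ELEMENT_WIDTH_PX)
        height = round(depth_mm / cylinder_mm * ELEMENT_HEIGHT_PX)
        return x, height

    print(cylinder_mark(279, 93))   # a 93-mm job on a 279-mm cylinder: one third of full height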

Figure 89. As a string of jobs is collected, the collected objects become shaded (labelled A) and the permissible presses become shaded (labelled C). The number of spokes on the blue ‘collector’ (labelled B) shows how many have been collected. When the ‘collector’ is right clicked, the job numbers are displayed in a pop-up box, listed in the sequential order of collection.

Human decision-makers should be able to attend to constraints at times appropriate to them. The proposition being put forward is that an HIPSS should not needlessly get in the way of decision-making. Users should be able to explore potential decisions in their own way, guided by their mental model of the scheduling process. Normally expert systems advise users through standard message boxes popping up onto the screen. Users have to attend to the message immediately. This can be quite disruptive to the scheduler’s ‘cognitive momentum.’ Graphic signs enable the ProtoHIPSS to pass messages unobtrusively. For soft constraints this is particularly efficacious, as the graphic labelled 4 in Figure 86, denoting the violation of a soft constraint, shows.150

An example of an unobtrusive message for a hard constraint relates to the choice of presses. In Figure 86 the black bars in the PAO — the sign for the presses — denote permissible choices. For example, the three bars (labelled 5) for job 16504 show three choices (which map to the press icons labelled 6), with the current choice indicated by the tallest bar.151 Because these constraints are always visible, they can be considered at any time. As the scheduler forms a string to be placed on a press, by left clicking on the JSOs, the ProtoHIPSS shows to which presses the string can be allocated (see Figure 89). Of course, if a scheduler attempts to violate hard constraints the activities of the ProtoHIPSS will not remain in the background. It then sends an intrusive warning through a pop-up message box.
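One way to express this division of labour in code is sketched below. The soft-constraint test is the condition given in footnote 150; the hard constraint, that a press must be able to print the number of colours a job requires, follows the discussion of the PAO, but the colour capacities and the example job are invented for the illustration. The KappaPC rule base itself is, of course, organised quite differently.

    # Soft-constraint infringements raise an unobtrusive cue; attempted hard-constraint
    # violations are refused with a reason. Press colour capacities are assumed values.
    PRESS_COLOURS = {"Akira 1": 2, "Akira 2": 2, "Akira 3": 3, "Akira 4": 4}

    def soft_cues(sequence):
        """Condition of footnote 150: a job with a lower width than its immediate
        predecessor, where the predecessor uses the same cylinder, gets a graphic cue."""
        cues = []
        for prev, job in zip(sequence, sequence[1:]):
            if job["cylinder"] == prev["cylinder"] and job["width"] < prev["width"]:
                cues.append((job["job_no"], "width change on the same cylinder"))
        return cues

    def try_allocate(job, press, queue):
        """Refuse a move that violates the colour-capacity hard constraint; otherwise
        perform it silently and leave any soft cues to the display."""
        if job["colours"] > PRESS_COLOURS[press]:
            return False, f"{press} cannot print {job['colours']} colours"   # pop-up warning
        queue.append(job)
        return True, ""

    queue = []
    ok, reason = try_allocate({"job_no": 1, "colours": 3, "cylinder": 279, "width": 229},
                              "Akira 1", queue)
    print(ok, reason)   # the move is refused and the reason is reported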

7.3.2 Relating the ProtoHIPSS to CWA

How does the ProtoHIPSS relate to the goal structure, decision ladder and abstraction hierarchy shown in Figure 84? The ProtoHIPSS may help schedulers

150

It arises when a job has a lower value of width than its immediate predecessor, if, and only if, the predecessor uses the same cylinder.

151

The order of the presses, from left to right, is Akira 4, Akira 1 and Akira 3, which reflects the number of colours they can produce.


perceive the work domain at different levels of abstraction for the following reasons.152

Figure 90. Feasible means-ends links for a particular job specification.

At the purpose-related function level, the ProtoHIPSS shows the constraints that specify the function, ‘process paper’, in the abstraction hierarchy (AH) developed in Chapter 6 and replicated in Figure 90. The list of objects in effect forms a sequence of abstraction hierarchies, as discussed in Chapter 6 and replicated in Figure 91. The Jobs-Window display in the ProtoHIPSS is equivalent to a plan view of the pack of abstraction hierarchies displayed in Figure 91. Peering downwards, metaphorically, the user can see for each card in the pack the work domain constraints. For each job, the JSO and PAO provide affordances to the different levels of the associated AH. Figure 92 shows some of the links between the screen objects and the AH.153

152

The theoretical arguments presented here have yet to be empirically tested.

153

Showing all the links would be at the expense of clarity.


Figure 91. Structural sequence of abstraction hierarchies.

Figure 92. The relationship between the abstraction hierarchy and the signs in the Jobs Windows of the ProtoHIPSS.


The JSO provides access to the intentional constraints associated with the purpose-related function, ‘process paper’ (i.e., the job’s attributes). Curved broken lines in Figure 92 depict the association between the constraints in the JSO and the AH. Users can also observe how the intentional constraints at the purpose-related level place constraints on the physical devices. The cylinder element of the JSO shows the depth — an intentional constraint set by the job specification — and the cylinder size, which is a configuration requirement for the press. Solid arcs having arrow ends show the link between the depiction in the Jobs Window and the constraint on the physical device. The PAO shows the constraints on permissible presses, which are derived from the purpose-related constraint, ‘the number of colours.’ Although the processing time is shown as an element of the JSO, from the perspective of WDA, it is not a specified attribute of the purpose-related function, ‘process paper,’ and therefore in terms of formal description it should be a separate sign.154 In Chapter 3, processing time was shown to be an attribute of the entity relation ‘IS_PROCESSED_BY’ between the entities, job and machine, as shown in Figure 93. However, in terms of WDA, the processing time is a constraint on the physical device. Its value is derived from the purpose-related constraints, ‘depth,’ ‘number of parts’ and ‘number of forms’ and the physical device constraint, ‘machine speed.’155 Accordingly, for an instantiation

154

Its inappropriate position is a consequence of the development of formalisms for describing an HIPSS proceeding concurrently with software development. Indeed, creating the formalisms was an iterative process of dialectical resolution between cognitive theory and scheduling praxis (Giddens, 1995). Unfortunately, there has been insufficient time to modify the display to a rendition of the current theory.

155

The formulae for calculating the processing time are given in Chapter 5. Although the calculation of processing time in the ProtoHIPSS is based on a constant machine speed, a simple refinement would allow the scheduler to modify it. The scheduler could then alter the speed to match operating conditions. In particular, quality affects the speed of the press. If rules on the quality-speed relationship were added to the knowledge base, the ProtoHIPSS could then advise the scheduler on an appropriate speed setting through an unobtrusive sign. Using this approach, the scheduler could remain in cognitive control.

of the AH for a specific job, the description of the work domain at the physical-device level is formed by instances for both the Machine and OperationMachine classes.

Figure 93. Relations represented by classes: Classes Operation and Machine represent operation and machine entities and class OperationMachine depicts the relation IS_PROCESSED_BY. (Operation carries the operation number, due date and number of parts; Machine carries the machine number and processing speed; OperationMachine carries the processing, start and finish times.)
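Rendered as code, the structure of Figure 93 might look like the sketch below. The field names follow the figure; the method bodies are placeholders, and the processing-time expression is only an illustrative stand-in for the formulae given in Chapter 5, which combine depth, number of parts, number of forms and machine speed.

    from dataclasses import dataclass

    @dataclass
    class Operation:
        """Purpose side of Figure 93: the operation (job) entity."""
        operation_no: int
        due_date: str
        no_parts: int
        no_forms: int
        depth_mm: int

    @dataclass
    class Machine:
        """Physical-device side of Figure 93: the machine entity."""
        machine_no: int
        proc_speed: float          # treated as a constant in the ProtoHIPSS

    @dataclass
    class OperationMachine:
        """The IS_PROCESSED_BY relation, carrying processing, start and finish times."""
        operation: Operation
        machine: Machine
        start_time: float = 0.0

        @property
        def proc_time(self):
            # Illustrative placeholder only; the actual formulae are in Chapter 5.
            op = self.operation
            return op.no_forms * op.no_parts * op.depth_mm / self.machine.proc_speed

        @property
        def finish_time(self):
            return self.start_time + self.proc_time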

The JSO does not show the number of forms and the number of parts, as it was presumed at the time of its design that the scheduler only used them to calculate the processing time: the constraint on the physical device.156 However, a recent reanalysis of the data from the field study showed that Neil often pre-empted jobs at the change of parts, as discussed in Chapter 5. If Bertin’s (1981) dictum that all data should be comprehensively displayed to enable the user to discover relationships had been followed, then this information would not have been removed. The scheduler’s inability to attune to this information during the scan of the screen makes it difficult to draw inferences using patterns in these data. The processing time by itself does not allow traversal from the physical-device level to the purpose-related function level of the AH. It acts as a constraint on the scheduler’s degrees of freedom in making decisions. Only the scheduler can conceive the most useful subsets for the environmental context.

156

However, all the values of all the attributes, including these, are available on right clicking the JSO.


Set-up time is not derived from the mapping of the intentional constraints at the purpose-related function level to the physical-device level of the AH. Instead, it is a product of the juxtaposition of AHs. The ProtoHIPSS only shows the time for the major set-up. The time for changing the cross perforator was not included, as it is small compared to the time to change the cylinder: however, it will be included as the ProtoHIPSS is refined. Time to change colours has not been included because of the complexity.157

In juxtaposing JSOs, differences in the local features of the elements signify a configuration change in moving from one AH to another. This may signify time lost in changing the set-up: it is a necessary but insufficient condition. A change from one cylinder size to another is a major set-up, which always takes a significant time and is clearly observable, as labels 2 and 3 in Figure 94 show. Changing the cross perforator, when the width of paper decreases from one job to another, is a minor set-up. A graphic object (labelled 1) signifies that time would be lost in changing jobs. Another minor set-up arises in changing from job 16504 (labelled 4) to job 16742 (labelled 2) because a cross perforator has to be removed, as the depth of paper changes. A change in colours between jobs may require no time to set up the press despite there being a change in configuration. Whether applicators have to be washed and reloaded depends on the colours available in all the applicators. Because these decisions are complex, as discussed in Chapter 5, schedulers must consider the patterns in colour usage across many jobs, and the significance that emerges depends very much upon the schedulers’ level of skill.

157

A simple refinement to the ProtoHIPSS would be to include the means for the scheduler to add set-up time on an ad hoc basis: however, care would then have to be taken to ensure that the scheduler updated it when the sequence of jobs changed. This may result in an excessive and needless cognitive load on the scheduler. If the rules for setting up the colour applicators could be elicited, the ProtoHIPSS could then advise the scheduler through an unobtrusive sign on situations that might potentially cause a set-up.


Figure 94. Set-up time is a factor that arises when the value of particular constraints differs between abstraction hierarchies. (The figure annotates the cylinder set-up time and the perforator set-up time arising between juxtaposed JSOs in the structural sequence.)
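The comparison of juxtaposed JSOs that Figure 94 illustrates can be sketched as a simple function over consecutive jobs. The classification below follows the distinctions made in the text (a cylinder change as the major set-up, perforator and depth changes as minor ones, colour changes left to the scheduler's judgement); the durations are invented placeholders, and the ProtoHIPSS itself currently displays only the major set-up time.

    # Changeover comparison between two consecutive jobs (illustrative durations).
    MAJOR_SETUP_MIN = 45     # cylinder change; assumed value
    MINOR_SETUP_MIN = 5      # cross-perforator or perforating-tool change; assumed value

    def changeover(prev, job):
        """Report the set-up implications of following `prev` with `job`."""
        notes, minutes = [], 0
        if job["cylinder"] != prev["cylinder"]:
            notes.append("major set-up: cylinder change")
            minutes += MAJOR_SETUP_MIN
        else:
            # The text ties the perforator change to the direction of the width change;
            # this simple test only flags that the width differs on the same cylinder.
            if job["width"] != prev["width"]:
                notes.append("minor set-up: cross perforator")
                minutes += MINOR_SETUP_MIN
            if job["depth"] != prev["depth"]:
                notes.append("minor set-up: perforating tool added or removed")
                minutes += MINOR_SETUP_MIN
        if set(job["colours"]) != set(prev["colours"]):
            notes.append("colour change: wash-up depends on all applicators (scheduler judges)")
        return minutes, notes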

The details in the goal structure, the decision ladders and the AH shown in Figure 84 vary between schedulers. The particular problem-solving technique a person applies depends upon experiential familiarity with the task (Moray et al., 1990). Therefore, it is important to design an HIPSS that has many degrees of freedom to allow for wide-ranging problem-solving strategies. An exhaustive analysis of possibilities will not be presented; instead, an example will be used to demonstrate the issues. As discussed in detail in Chapter 5, Neil aims to keep the four presses operating productively (i.e., goal 1C in the goal structure replicated in Figure 95) by seeking to fully utilise all presses (goal 1B). The principal way to achieve this is to minimise time wasted in setting up the presses (goal 2A). At Melamed three job attributes, depth, width, and colour, affect the set-up time. In the AH, the job attributes define the intentional constraints at the level of functional purpose. The constraints on depth, width and colours project to cylinder size (related to goal 2AS1), the location of broken teeth on the cross perforator (related to goal 2AS3) and the ink in the colour applicators (related to goal 2AS4), respectively, at the level of physical device. As these goals all relate to reducing the time lost in setting up the press, they concern the juxtaposition of JSOs, which was discussed above. To move the problem solution closer to meeting these goals, the scheduler could shuffle the job objects in the Job Window until a satisfactory pattern is obtained. To maximise the number of jobs before a cylinder change, jobs would be


rearranged so that the vertical line would be unbroken through the cylinder element of the JSOs. For the screen shown in Figure 86, moving job 16748, labelled 12, to the end of the string of jobs with the cylinder size labelled 13 would meet this condition, as there would be three unbroken lines. To minimise the time lost in changing colours, the scheduler would arrange the jobs in this group to minimise wash ups.

Figure 95. The scheduling goal structure.
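A small, concrete illustration of the kind of check that supports this shuffling is given below: counting the cylinder changes implied by an ordering of a Job Window. The job data are invented; the point is only that grouping jobs on the same cylinder drives the count of major set-ups down.

    def cylinder_changes(sequence):
        """Count the major set-ups (cylinder changes) implied by an ordering of jobs."""
        return sum(1 for a, b in zip(sequence, sequence[1:])
                   if a["cylinder"] != b["cylinder"])

    jobs = [{"job_no": 1, "cylinder": 279},
            {"job_no": 2, "cylinder": 305},
            {"job_no": 3, "cylinder": 279}]
    print(cylinder_changes(jobs))                                       # 2 changes
    print(cylinder_changes(sorted(jobs, key=lambda j: j["cylinder"])))  # 1 change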

The system not only supports the behaviour of the scheduler observed at Melamed, but supports the behaviour of any schedulers in this domain who had previously worked with the machine-loading board, as they may move jobs at will. Because the scheduler can observe constraints at both the purpose-related function and the physical device levels using the ProtoHIPSS, there is no restraint on decision-making strategies. As the ProtoHIPSS allows schedulers to move to different levels of abstraction in an ad hoc way, they should be able to solve scheduling problems in an opportunistic way (Woods and Roth, 1988). When acting opportunistically, schedulers may not completely follow through a decision strategy. In Chapter 6, a hierarchy of cyclic units was used to describe how the process of solving a problem is often approached through solving sub-problems. While a person may sketch out the hierarchy and then fill out the individual units as the problem progresses, the complete sequence may not be traversed. As

schedulers move through the set of transformations (i.e., subgoals) in the cyclic units, they may see other patterns that invoke new problem sets.158 The ProtoHIPSS allows them to partially complete the activity directed to a particular goal and focus on some other goal. Depending on how far the new transformations change the patterns in the data, the scheduler may return to unsolved sub-goals and proceed with any one of them at any time. For example, while developing a string of jobs that minimises set-up, the scheduler could come across jobs that have high priority. He/she could direct his/her activity to scheduling these urgent jobs. When this scheduling activity has finished, the scheduler could return to the original activity.

Figure 96. Computer support for rule-based decisions.

158

Such behaviour is not unique. Whitefield (1985) observed similar behaviour with users of CAD (Computer Aided Design) tools. He found that designers using CAD do not take one task goal, break it into sub-goals, solve each sub-goal in turn, and then take the next task goal.


The provision of OR heuristics allows schedulers to experiment with techniques that they might otherwise never attempt without such a tool. If Neil were to use the ProtoHIPSS, he could add these heuristics to his repertoire of practices, if he found them useful. In effect, a new type of decision ladder would be formed that had the OR heuristics as procedural steps. The control tasks in the decision ladder may include a filtering process — the selection of jobs — and automated procedural steps. That is, the computer offers support for carrying out policies, as was shown for the MHS (Model Human Scheduler) in Chapter 4 and replicated in Figure 96. The shaded area on the right identifies where in the decision space the computer offers support. In effect, there are two decision ladders, as depicted by the two generic ladders in Figure 97; the human selects a scheduling policy and the ProtoHIPSS carries it out.

Figure 97. The human scheduler recognises the relevant policy and the computer carries out the procedure.
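Reduced to code, the split shown in Figure 97 amounts to the human choosing a policy and the computer reordering the string the human has selected. SPT (shortest processing time) and EDD (earliest due date) are named in footnote 136; R&M is omitted here, and the job fields and data are assumptions for the sketch.

    # The human chooses the policy; the computer only reorders the selected string.
    POLICIES = {
        "SPT": lambda job: job["proc_time"],    # shortest processing time first
        "EDD": lambda job: job["due_date"],     # earliest due date first
    }

    def apply_policy(selected_string, policy_name):
        """Return the selected jobs reordered under the chosen policy.
        The original list is left untouched, so the action is easily undone."""
        return sorted(selected_string, key=POLICIES[policy_name])

    string = [{"job_no": 1, "proc_time": 3.0, "due_date": "1999-05-06"},
              {"job_no": 2, "proc_time": 1.5, "due_date": "1999-04-30"}]
    print([job["job_no"] for job in apply_policy(string, "SPT")])   # job 2 before job 1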

7.3.3 Improving Performance

In the relationship between the goal structure, decision ladder and abstraction hierarchy in Figure 84, the apex of the goal structure coincides with the functional purpose level of the AH, and the level immediately below coincides with the highest-level priorities in the AH. While occasionally Neil’s attention was drawn to high-level goals, he mostly addressed goals that were immediate operational


objectives (e.g., low press set-up time across the shift) which form the lowest level of the goal structure shown in Figure 98. If his focus could have been raised to higher-level goals, then performance on these goals might have been improved.159 Goals at the level above the immediate operational objectives tend to be directed towards functional goals. Some of these goals are commensurate with the traditional OR goals. For example, percentage utilisation is a suitable measure of performance for the goal ‘fully utilise all machines.’ Similarly, average tardiness is a suitable measure for the goal ‘all jobs delivered on their due date.’

Figure 98. Schedulers’ practices commonly address operational objectives. An HIPSS should extend their interest to functional goals.

For each Jobs Window in Figure 85, a performance measure is shown in the top left corner that refers to the machine associated with the window. In the window for unallocated jobs, the measure refers to the performance across all presses. In Figure 86, the process performance measure (labelled 14) shows the average (weighted) tardiness for the jobs on Akira 3. The current version of the

159

This proposition, currently at the level of supposition, needs to be field-tested.


ProtoHIPSS allows schedulers to choose one of several performance measures. To fully extend scheduling behaviour to the direct consideration of higher-level goals, an HIPSS should provide a range of measures across the goal structure at the functional goal level.

A seemingly credible way that Neil could extend his scheduling behaviour pertains to the formation of strings. In forming strings of jobs to satisfy a particular goal (e.g., ‘complete all jobs for a customer concurrently’), he often formed subgroups that met a secondary objective (e.g., ‘low press set-up time’). When he came to arrange the jobs within each subgroup, or within a complete string where subgroups did not exist, his decisions frequently appeared arbitrary, with no prevailing goal guiding decision making. However, if he could see how different arrangements of jobs within the subgroups affected performance associated with higher-level goals, he could experiment with ways to refine the ordering. Whether he would bother with the extra work is highly dependent upon his perception of the pay-off for the effort expended. In developing practices that address this ‘micro’ ordering, he may find the supplied OR heuristics useful.

To encourage the development of procedures that address higher-level goals through experimentation, schedulers ought to receive feedback on how potential moves might improve performance. While the ProtoHIPSS lacks the means for such aiding, a version of an HIPSS that incorporates such a feature is shown in Figure 99.160 For a selected measure — weighted tardiness for the case shown — a user can observe the particular machine’s performance (labelled 9) and the performance across all machines (labelled 10). For each job, there is a graphic object on the right of the JSO that shows the job’s contribution to performance. For some jobs, the sign includes a solid rectangle, which is red in a colour display. Its width (labelled 4) is the value of the measure and its height (labelled 3) shows

160

This HIPSS has been designed for an experiment using a Parsifal data set for scheduling on identical parallel machines, which is planned to run mid 1999. The performance of parallel scheduling heuristics and a benchmark value (discussed in Chapter 5) will be compared to human schedulers, who can only call on simple scheduling heuristics (SPT, EDD and R&M).


the weight used for the job. A horizontal line (labelled 5), which is green in a colour display, shows the unweighted measure: tardiness for this case. Note that job 5 (labelled 8) is tardy. If the mouse pointer (labelled 7) is placed on this job, a bar appears between jobs 1 and 4 (labelled 6). This bar shows where to move the job for it to be produced on time. For jobs that are not tardy, the solid rectangle is replaced by a transparent rectangle (labelled 1), and a horizontal line (labelled 2), which is green as before, shows its earliness. There may be various arrangements that would give the same value of overall tardiness. On observing the distribution of tardiness, schedulers may assess the suitability of the arrangement in relation to other factors. For example, under some circumstances, they may prefer to distribute tardiness evenly across many jobs and at other times, they may decide to concentrate it in a few jobs. The robustness of the schedule and the importance of particular customers are two factors that, perhaps, might affect such a decision.

Figure 99. Display of the contribution from each job to the performance measure.
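The measure that Figure 99 displays can be sketched as follows. Completion times are taken here as the running sum of processing times on a single press, which is a simplification, and the job data are invented; the sketch merely shows how each job's weighted tardiness, and the average over the press, could be derived for the display.

    def weighted_tardiness(sequence):
        """Return each job's weighted tardiness and the average over the press."""
        clock, contributions = 0.0, []
        for job in sequence:
            clock += job["proc_time"]
            tardiness = max(0.0, clock - job["due"])
            contributions.append(job["weight"] * tardiness)
        average = sum(contributions) / len(sequence) if sequence else 0.0
        return contributions, average

    press = [{"proc_time": 2.0, "due": 5.0, "weight": 1.0},
             {"proc_time": 4.0, "due": 5.0, "weight": 2.0}]
    print(weighted_tardiness(press))   # the second job is tardy; the average follows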


7.4 The Contribution of HIPSS to Scheduling Practice

To obtain the necessary scale for sales, scheduling tools are generally designed for a particular industry sector rather than a specific plant.161 Designed to fit one or more classes of manufacturing with little or no adaptation to the specific environment, they include public or proprietary scheduling algorithms and standard methods for machine specification and routing (McKay, 1997). Usually they are integrated into the firm’s manufacturing information system and have various reporting capabilities and extensive options for describing the work. Juxtaposed with these traits is their inability to use decision strategies that are outside the restricted bounds set by the design. McKay points out that it is generally difficult, if not impossible, with these tools to add new types of data, constraints and heuristics. Consequently, they do not accommodate the vagaries of real scheduling environments. Even tools customised for a particular plant can only generate feasible schedules where the manufacturing environment is fully describable and sufficiently stable. When the situation is not well known, stable, or well defined, the problem-solving activities must be adaptive and therefore better done by human schedulers (Wiers and McKay, 1996).

The contribution of an HIPSS to scheduling practice is the replacement of ad hoc approaches to the design of interactive scheduling systems with a methodology that has a sound theoretical foundation and superstructure:

1. Its foundation is the centrality of the scheduler in the decision-making process;
2. CWA provides the superstructure;
3. Theories relating to signification and the form of object displays inform interface design.

161

Two examples of software for the printing industry are LMS (Logic Management System) — see http://www.zlogic.com/index.htm — and Prestige Scheduler — http://www.scheduling.co.nz/. A non-industry specific example of a generic scheduling software is the VirtECS Scheduler — http://www.combination.com/prodnsrv/schedule/schedule.html.


7.4.1 Centrality

As human schedulers are located centrally in the decision-making architecture of an HIPSS, they can actively participate in the decision-making process without being tightly bound to a restrictive perspective of the problem, that is, their decision-making strategies are not restricted to a predetermined set. They can move around the decision space at will, including going beyond the limited information held within the plant’s database to other factors, especially those that are unstructured. They can apply their tacit knowledge, which has been developed through continuously gathering information as social beings (McKay, 1987). Consequently, they can fill in blank spots of missing information using their tacit knowledge. Being active agents in the decision-making process, they can provide information about constraint strengths, constraint relaxation, and penalties for constraint violations (McKay, 1992).

To meet the requirement of centrality, an HIPSS is designed to allow schedulers to modify data without necessarily changing the parent database, thereby enabling them to tweak the system.162 From their experience, schedulers can: judge the quality of the information presented; estimate the values of parameters for which information is incomplete; judge and correct data where there are discrepancies between different records. For example, at Melamed, in using the information recorded in the database to write up the job tags, Neil sometimes changed values on entering them on the tags. These changes were necessary where the various persons entering data had made errors in data entry, due to typing mistakes or misconceptions about the meaning associated with some data fields. Changes also occurred for contextual reasons: because stock was already available from previous runs of the job, the order quantity, which sets the production quantity, was altered.163

162

They can change the values of both job attributes and derived parameters, such as set-up time and processing time.

163

In one case during the field study this situation arose because rework had been necessary in a previous run of the job: however, as there was insufficient time for reworking, the job was rerun and the original work was held over. It was then rectified before the customer reordered the work.

For humans to be in cognitive control, the ‘momentum’ of their decision making should not be unnecessarily disrupted. Therefore, a most important characteristic of an HIPSS is the offering of advice, where possible, in a non-intrusive way. The only exception occurs when the scheduler has ignored non-intrusive advice regarding a hard constraint and attempts to carry out an action that would, if allowed, violate the constraint. The HIPSS disallows the move and gives the user the reason for its refusal. Where automated procedures are available, for example, the OR heuristics in the ProtoHIPSS, the HIPSS is designed so the user maintains control. With the ProtoHIPSS, the scheduler decides to which string of jobs the heuristic will be applied. They can also undo and redo an action, whether automated or manual. They can manoeuvre freely in the decision space without being bound to irreversible decisions.
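Keeping the scheduler in control of both manual moves and automated heuristics suggests recording every action together with its inverse. The sketch below shows one conventional way of providing the undo and redo described above; none of it is taken from the KappaPC implementation.

    class MoveHistory:
        """Record each action (manual or heuristic) with its inverse so that any
        decision can be undone and redone."""
        def __init__(self):
            self._undo, self._redo = [], []

        def do(self, action, inverse):
            action()                       # perform the move
            self._undo.append((action, inverse))
            self._redo.clear()             # a new move invalidates the redo chain

        def undo(self):
            if self._undo:
                action, inverse = self._undo.pop()
                inverse()
                self._redo.append((action, inverse))

        def redo(self):
            if self._redo:
                action, inverse = self._redo.pop()
                action()
                self._undo.append((action, inverse))

    queue, history = [], MoveHistory()
    history.do(lambda: queue.append(16504), lambda: queue.pop())   # allocate a job
    history.undo()                                                 # the queue is empty again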

7.4.2 Cognitive Work Analysis

Cognitive Work Analysis (CWA) provides the superstructure on which to build an HIPSS. The physical constraints on scheduling imposed by the manufacturing system within the work domain are analysed (WDA) using a means-ends abstraction hierarchy (AH). The levels of physical function and physical device in the AH are obtained through a detailed analysis of the form and function of the physical system: a process that is normally not problematic. Such an analysis was carried out in Chapter 6 for printing at Melamed, the results of which are replicated in Figure 63. For each job, the level of purpose-related function is derived from the job specification. In an environment such as printing in which the purpose-related function is singular, ‘process paper to specified attributes,’ there is generally a set list of constraints with only the values changing between jobs. In a general job shop, the purpose-related function can vary significantly and consequently the constraints themselves vary and not just their values. In designing a scheduling tool, the potential range of purposes has to be recognised to ensure that the affordances provided by the HIPSS support all possible control


This does not mean that unrecognised purposes will not be supported; it merely means that data may not be displayed using signs (e.g., the JSO with the ProtoHIPSS) that map constraints from the purpose-related level to the physical-device level.

[Figure 100 shows the means-ends links for the printing work domain: physical functions such as procuring continuous and sheet paper, continuous printing, sheet printing, folding, perforating, special finishing, cutting into sheets, collating and finishing, and storing reels and stacks of paper, linked to physical resources (the Hunkeler, AKIRA, Minami, Sanden, Bowe and Trident machines) and to the outputs of fan-fold and sheeted forms.]

Figure 100. Ends-Means relationships between physical functions and physical resources.
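One way the means-ends links of Figure 100 could be recorded for use by an HIPSS is as a simple mapping from physical functions to the physical devices that can realise them. The sketch below is illustrative only: the function and device names are taken from the figure, but the particular assignments of devices to functions are assumptions made for demonstration, not a record of the Melamed plant.

```python
# Illustrative sketch of means-ends links between physical functions and
# physical devices. The assignments are assumptions for demonstration.

PHYSICAL_FUNCTIONS = [
    "procure continuous paper", "continuous printing", "fold", "perforate",
    "procure sheet paper", "cut into sheets", "sheet printing",
    "collate & finish", "store",
]

# function -> devices that can serve as means for that function (illustrative)
MEANS = {
    "continuous printing": ["Hunkeler"],
    "sheet printing": ["AKIRA", "Minami", "Sanden"],
    "collate & finish": ["Bowe", "Trident"],
    "store": ["reel of paper", "stack of paper"],
}

def devices_for(function):
    """Which physical devices afford a given purpose-related requirement."""
    return MEANS.get(function, [])

print(devices_for("collate & finish"))   # ['Bowe', 'Trident']
```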

Just as artisans express their skills through their tools, a well-designed HIPSS allows expert schedulers to apply their skills with ease. This requires the affordances in the HIPSS to be in accord with the schedulers’ decision-making behaviour. To ensure that the affordances support the goal-directed behaviour of schedulers working in the domain, their goals are collected. This is a pragmatic process that uses whatever means are available to the designers, who have to operate within the restrictions of the schedulers’ daily work activity.164

164 It is usually extremely difficult to see schedulers away from the ongoing activities of the workplace. They cannot usually spare the time during their shift, and are too tired or unwilling to spend time outside the normal working day on what they perceive as extraneous activities.

Although the methodology for developing an HIPSS is not prescriptive in this regard, a common approach to data collection is to mix questioning and observation, with perhaps follow-up questionnaires. The goals can then be mapped to the goal structure. The higher-level goals are the product of deduction, as discussed in Chapters 5 and 6. The data collection process can reveal many of the current control tasks for particular scheduling objectives, which can be mapped to decision ladders. However, for control tasks that are directed towards higher-level goals, which are not part of current scheduling practice, the broad aspects of the recognition and action cycles in the decision ladders can be deduced to identify elements that could be supported by the HIPSS. For example, the high-level goal ‘minimise average tardiness’ obviously requires the HIPSS to signal the average tardiness for a group of jobs.

Constructing a decision ladder for a goal is an analytic process that allows the designer of an HIPSS to see what decision activities associated with the goal need to be supported. For example, to support the recognitional activities on the left side of the decision ladder, the designer may provide the means for observing each job’s tardiness. To support procedures for minimising tardiness, which are on the right side of the ladder, suitable affordances for carrying out the procedure can be included. For instance, where the procedure arranges jobs so that there is a string having the same machine configuration, the HIPSS could provide suitable affordances for attuning the scheduler to the appropriate order of jobs (e.g., the graphical features for cylinder size in the printing case). Occasionally, where procedures are well defined, they can be wholly or partially automated, thereby relieving schedulers of some rule-based tasks. Whether the system learns the rules the human applies or someone directly enters them is a moot point; all the caveats regarding expert systems discussed in Chapter 2 apply.165

165 Wiers (1996) highlights the problems with extracting information from expert schedulers, as they tend to document what they were taught and not what they have learnt.
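For the goal ‘minimise average tardiness’ used as the example above, the recognitional (left-hand) side of the decision ladder needs each job’s tardiness to be observable, and the goal itself needs a group-level indicator. A minimal sketch, assuming each job carries a due date and a projected completion time (the job data are invented for illustration):

```python
# Minimal sketch of the indicators needed for the goal 'minimise average
# tardiness': per-job tardiness for observation, and the group average as a
# higher-level signal. Job data are illustrative.

def tardiness(completion, due):
    """Tardiness is zero when a job finishes on or before its due date."""
    return max(0, completion - due)

def average_tardiness(jobs):
    """jobs: list of dicts with 'completion' and 'due' (e.g. in hours)."""
    return sum(tardiness(j["completion"], j["due"]) for j in jobs) / len(jobs)

jobs = [
    {"name": "J1", "completion": 40, "due": 36},   # 4 hours late
    {"name": "J2", "completion": 20, "due": 30},   # on time
    {"name": "J3", "completion": 55, "due": 50},   # 5 hours late
]
for j in jobs:
    print(j["name"], tardiness(j["completion"], j["due"]))
print("average tardiness:", average_tardiness(jobs))   # (4 + 0 + 5) / 3 = 3.0
```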


In automating procedures, the stumbling block is uncertainty, which makes it difficult to increase the smartness of the system. Any smart features installed must effectively and efficiently complement the human tasks (Wiers and McKay, 1996).

A single decision ladder represents only part of the activity analysis (AA). As schedulers switch between goals, multiple decision ladders are required to model their behaviour. Where they follow a systematic strategy of stepwise improvement to the schedule, Volpert’s cyclic unit can be used to place each subgoal within the context of the primary goal, as described in Chapter 6 for minimising the number of set-ups. Where these transformations can be recognised, the affordances can be designed for easy traversal from one subgoal to the next.

7.4.3 Signification and the Display of Objects

Having found which affordances should be present in the HIPSS, the designer has to create objects in the interface with the requisite properties for affordance. For some objects the surface properties merely need to reveal the action invoked; menu items or buttons describing the underlying action may suffice. For example, ‘SPT’ as a descriptor may provide sufficient attunement for a procedure that performs the ‘shortest to longest time’ ordering of jobs. Where schedulers have to find patterns in the data on which to draw inferences about possible scheduling strategies, the form of the signs is critical. Each representation has its own set of constraints and intrinsic and extrinsic properties. It emphasises some mappings at the expense of others. Therefore, the design of the signs must be well grounded in theory. For example, in designing the ProtoHIPSS, theories relating substitutive and additive scales and global and local features were exploited.
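The ‘SPT’ descriptor example can be made concrete: the action behind such a menu item or button simply reorders the selected string of jobs from shortest to longest processing time. A minimal sketch, assuming jobs are held as (name, processing time) pairs; the data are illustrative:

```python
# Sketch of the action behind an 'SPT' button: order the selected string of
# jobs from shortest to longest processing time. Data are illustrative.

def spt_order(jobs):
    """jobs: list of (name, processing_time); returns a new SPT-ordered list."""
    return sorted(jobs, key=lambda job: job[1])

selected_string = [("J7", 120), ("J3", 45), ("J9", 80)]
print(spt_order(selected_string))   # [('J3', 45), ('J9', 80), ('J7', 120)]
```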


7.4.4 Maintaining and extending expertise

The use of an HIPSS occupies a significant portion of the schedulers’ attention and abilities (Sanderson, 1989). Its form helps schedulers perceive, interpret, and experience information during the decision-making process. They are therefore placed in a position where they can keep, and hone, their skills, particularly those relating to inductive logic and pattern recognition (Sheridan, 1976).166 The form of an HIPSS can also help educate users. Where the display provides affordances to the higher levels in the goal structure, the scheduler’s performance may improve. The addition of automated procedures, which the users can try out, and signs that relate to the performance of higher-level goals may encourage schedulers to explore new scheduling strategies. Thus, the form of the HIPSS can help to restructure the user’s view of the scheduling problem.167 This has to be done with care, as schedulers are reluctant to apply rules that they see as obscure.

166 This avoids the problem of cognitive starvation that Wiers and McKay (1996) identified with intelligent scheduling systems that relied on expert system rules.

167 Hutchins, Hollan and Norman (1985) and Mitchell and Saisi (1987) discuss the relationship between the screen design and the restructuring of the user’s view.

7.5 Limitations that need addressing

The contribution of the HIPSS paradigm to scheduling practice is a methodology for addressing decision making in complex systems in which there are many competing and conflicting goals. This methodology meets significant challenges, identified by Wiers (1996), in implementing scheduling techniques in practice: difficulties in dealing with complexity and inadequate interaction with the human scheduler. However, it does not directly address other challenges raised by Wiers. It does not advance a scheduling theory that decides what performance measures are relevant for environments that are uncertain and unstable. Neither does it offer a theory for developing new scheduling strategies that meet the requirements of robustness.


Another dimension that still needs addressing concerns the inherent weaknesses of humans. While humans bring particular insight to solving scheduling problems, they may need to be manoeuvred away from cognitive biases that can lead them to follow an inappropriate strategy. Sage (1981) identifies twenty-seven biases, among them: fixating on the first approach to a solution that they identify, to the exclusion of other possibilities; a predilection for using easily recalled information; and the allure of spurious cues. While the methodology can produce a hierarchy of goals and relate high-level goals to affordances that may encourage the exploration of strategies that focus on these higher goals, it does not explicitly address how to discourage ineffective metrics that are part of firmly established behaviour.

The scheduling problem being too large for human schedulers to comprehend was identified by the SIGMAN workshop as a general limitation of interactive scheduling systems. Problem size affects the efficacy of the ProtoHIPSS; nonetheless, it meets the design requirements for a simple interactive scheduling tool to be used in a small job shop with only a few major machines and a strong similarity between jobs. Which affordances are appropriate depends upon the dimensions of the problem (e.g., the number of jobs and operations). For systems in which the scheduler’s attention cannot be solely directed to the bottleneck, the JSO may need to provide affordances to the constraints on the potential machines for all, or at least the significant, operations. To reduce screen clutter, focussing and zooming features may be necessary. To ensure that the scheduler’s navigational and cognitive momentum is not disrupted, implementation of such features requires care (Rasmussen and Pejtersen, 1995).

For large systems, the JSO may signify a string of jobs that have been grouped on a specific attribute. The elements of the sign may then denote the relative dominance of other attributes. Using the Melamed case as an example, the JSO in Figure 101 signifies a string of 30 jobs (labelled 2) for the same customer (labelled 1).168 In this group, there are more jobs requiring the cylinder labelled 3 than that labelled 4, and more jobs requiring the colour labelled 6 than the colour labelled 5. Access to the constraints associated with each job in the group is still necessary: one feasible way of getting to the JSO for each job in the string would be through clicking on the JSO for the group.

[Figure 101 shows a group-level JSO for a string of 30 jobs for the customer Acme, with graphical elements labelled 1 to 6 denoting the customer, the job count, and the relative dominance of cylinder and colour attributes.]

Figure 101. A conceptual representation of a JSO for a group of jobs.

168 This JSO is merely a conceptual demonstration, and its form is not a product of rigorous and thorough design. An obvious omission is affordance to a higher-level goal (e.g., tardiness).

The development of an HIPSS for large problems is not just a matter of screen design; it also requires control tasks to be defined for aggregating the elements of the large problem into a manageable number of groups. Whether the aggregation is automated or manual, it must be under the control of the human scheduler. The form of these control tasks for a particular environment is ascertained through a CWA of the domain.
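The aggregation control task described above can be sketched: jobs are grouped on one attribute (here, customer) and, for each group, the relative dominance of other attributes (such as cylinder or colour) is tallied for display in a group-level JSO. The attribute names and data below are illustrative assumptions, not field data.

```python
# Sketch of an aggregation control task for large problems: group jobs on one
# attribute and summarise the dominance of other attributes within each group.
# Attribute names and job data are illustrative.
from collections import Counter, defaultdict

def group_jobs(jobs, group_by, summarise):
    """Return {group value: {'count': n, attribute: Counter of values}}."""
    groups = defaultdict(list)
    for job in jobs:
        groups[job[group_by]].append(job)
    summary = {}
    for value, members in groups.items():
        summary[value] = {"count": len(members)}
        for attr in summarise:
            summary[value][attr] = Counter(m[attr] for m in members)
    return summary

jobs = [
    {"customer": "Acme", "cylinder": "22in", "colour": "black"},
    {"customer": "Acme", "cylinder": "22in", "colour": "blue"},
    {"customer": "Acme", "cylinder": "17in", "colour": "black"},
    {"customer": "Baker", "cylinder": "17in", "colour": "red"},
]
print(group_jobs(jobs, "customer", ["cylinder", "colour"])["Acme"])
# {'count': 3, 'cylinder': Counter({'22in': 2, '17in': 1}),
#  'colour': Counter({'black': 2, 'blue': 1})}
```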

7.6 Summary

The argument presented in this chapter is that the design of interactive scheduling systems needs to be informed by theory. Formalisms for designing an HIPSS were developed that:

1. Locate human activity centrally in the decision-making process;
2. Use an extended form of Cognitive Work Analysis to analyse the work domain and decision-making activities in complex systems in which there are many competing and conflicting goals;
3. Use theories of signification to inform interface design.

By applying the methodological tools developed for an HIPSS, the benefits of interactive scheduling systems, which the SIGMAN workshop anticipated, are expected to be realisable:

1. The pursuit of goals and the enforcement of constraints that are difficult to represent computationally;
2. The following of methods that schedulers find natural;
3. The freedom for schedulers to use their intuition.


Chapter 8 Conclusions, Reflections and Future Work

Hybrid Human-Computer Intelligent Automation (HHCIA), in which human and machine ‘intelligence’ are combined, was related to the established models for supervisory control of continuous processes. ‘Hybrid’ was used in this thesis to signify human and machine intelligence combining cooperatively. Two distinct phases of control in HHCIA were delineated: the control during the processing of a batch of work and the control of the changeover of batches. In discrete manufacture the planning of the changeovers, and hence production scheduling, was shown to be central. The focus of the thesis was therefore on the development of formalisms for representing human-computer decision making associated with the control of the changeover of batches in a job-shop environment. This provided the foundation for the development of a hybrid intelligent human-computer paradigm for the scheduling component of HHCIA.

In this thesis, a hybrid intelligent human-computer paradigm for job shop scheduling that supports human schedulers operating in environments characterised by uncertainty and instability has been advanced. Its contribution to scheduling practice is a methodology for addressing decision making in complex systems in which there are many competing and conflicting goals. Theoretical constructs have been developed on which to base the design of interactive scheduling systems.

The major research question posed by this research, as stated in Chapter 1, has the following linked components:

1. What are the scheduling strategies employed by human schedulers in job-shop scheduling?
2. How are these influenced by the abstractions employed in the data presented to the scheduler about the state of the system and its performance requirements?
3. What are the most appropriate formalisms for developing a hybrid intelligent human-computer production scheduling system (HIPSS)?

The first question was advanced through a discussion of the general characteristics of the decision-making environment in job-shop scheduling. It was argued that schedulers, in relating job characteristics to the state of the working environment, use their deep knowledge of the domain, which includes non-manufacturing data, to draw inferences about possible scheduling strategies. Their perception of constraints, objectives, and decision heuristics adapts as the manufacturing system changes. To understand how they solve scheduling problems, scheduling was placed within a systems-thinking context in which human schedulers make decisions through purposeful rational action. As scheduling activity is characterised by complexity and perplexity, intentional models, rather than functional analysis, were used to describe the purposive action of schedulers. That is, the behaviour of the manufacturing process is constrained by a scheduler’s intentions. Through an ethnographic study of a scheduler operating in a printing company, the strategies he employed in scheduling under perplexity were used as a footing for developing formal constructs for describing how schedulers interpret signals from the environment and work out appropriate actions.

To analyse how scheduling strategies are influenced by the abstractions employed in the data presented to the scheduler about the state of the system and its performance requirements, Cognitive Work Analysis (CWA) was enlisted. The work domain analysis (WDA) and activity analysis (AA) tools of CWA were applied to the data from the field study. A WDA of the field data was used to develop an ends-means description of the physical functions and physical resources of the job shop using an abstraction hierarchy (AH). The goals that the scheduler sought and the operational policies he applied were also found. AA was used to represent scheduling activity. Just as the AH is a formal descriptor for WDA, which provides a generic framework for describing goal-oriented systems, the decision ladder was used as a formal descriptor for the activities in AA. The goals that the scheduler sought were consolidated into a structure that linked goals at various levels of abstraction.


The currently available CWA tools were found inadequate for representing human decision-making processes in discrete-event systems. New tools were developed to extend the current formalisms, which included the goal structure, multiple decision ladders and abstraction hierarchies. A relationship was shown between the decision ladder, the goal structure and the abstraction hierarchy. The dynamic aspects of goal setting and attainment were represented using the cyclic units of Action Regulation Theory, developed by German work psychologists.

Having developed formalisms for representing decision making, the question of how scheduling strategies are influenced by abstractions employed in data presentation was addressed. It was argued that schedulers seek different information from a display, depending upon where they are in the decision ladder. A scheduler seeks different patterns in the data at different conjunctures. To support the various types of reasoning, the display needs to provide perceptual affordances that allow the scheduler to effortlessly extract the information required for any level of the abstraction hierarchy. By mapping surface features of the display to functional and intentional constraints, schedulers can perceive information at multiple levels of abstraction. The form of the signs for representing data was shown to be critical, as schedulers have to find patterns in the data on which to draw inferences about possible scheduling strategies. Each representation has its own set of constraints and intrinsic and extrinsic properties. It emphasises some mappings at the expense of others. Therefore, the form of data representation must be well grounded in theory. Signs used to represent data need to:

1. Clearly and distinctively show the job attributes;
2. Display the values of the attributes unambiguously;
3. Support the scanning of jobs to locate those jobs having a particular attribute value;
4. Clearly display patterns in attributes across jobs.

A prototype HIPSS (ProtoHIPSS) was developed to demonstrate the applicability of the formalisms used to represent the decision-making activity of human schedulers to the design of an HIPSS that can support schedulers in perplex environments. It was argued that an HIPSS that presents information to schedulers in a way that they can directly perceive the attunements would reduce the need for any mediating inferential process.

A system that substitutes visibility for storage and replaces mental operations with visual inferences would therefore effectively support its users, as the attunements act as external memory.

The contribution of an HIPSS to scheduling practice is the replacement of ad hoc approaches to the design of interactive scheduling systems with a methodology that has a sound theoretical foundation and superstructure:

1. Its foundation is the centrality of the scheduler in the decision-making process;
2. CWA provides the superstructure;
3. Theories relating to signification and the form of object displays inform interface design.

The HIPSS paradigm is a methodology for addressing decision making in complex systems in which there are many competing and conflicting goals. It does not advance a scheduling theory that decides what performance measures are relevant for environments that are uncertain and unstable. Neither does it offer a theory for developing new scheduling strategies that meet the requirements of robustness.

Just as the operator in a process-industry control room has instruments for displaying the state variables and performance measures of the plant, and alarms to warn of critical constraint violations, an HIPSS provides an environment for schedule control, with features for showing the state of the schedule, for indicating performance and for warning of the violation of constraints. As the understanding of scheduling factors in the domain is improved, new indicators and automated procedures, which are under the command of the schedulers, may be incorporated in a hybrid intelligent production scheduling system. By being actively involved in decision making, the human can deal with contingencies and other aspects of scheduling jobs that are difficult to vest in a computer decision-maker. The use of intelligent human decision-makers with vast local knowledge also obviates the need for an exhaustive knowledge base.

The hybrid intelligent human-computer paradigm does not call for the abandonment of classical OR methods, but instead accepts Morton and Pentico’s (1993) proposition that “All useful approaches should be pursued.” The hybrid approach to scheduling can help to invigorate Operations Research, as heuristics whose applicability has been under question may become quite useful when applied judiciously.


The human can act as an intermediary between the real-world manufacturing environment and the abstract world of operations research, by:

• Dealing with the stated and unstated conflicting goals;
• Resolving how to use information that is incomplete, ambiguous, biased, outdated, or erroneous;
• Grouping jobs to meet the specific criteria for applying selected heuristics.

Although it is argued that an HIPSS is expected to realise the benefits of interactive scheduling systems, which the SIGMAN workshop anticipated, this claim will remain at the level of conjecture until results are obtained for its application to real scheduling environments. The ProtoHIPSS will be refined over the next year. Once complete, a site for studying its application in the field will be sought.

Figure 102. Display of the contribution from each job to the performance measure.

An HIPSS has been designed for an experiment on scheduling identical parallel machines, which is planned to run in mid-1999. The data sets that will be used are from the Parsifal software, which accompanies Morton and Pentico’s (1993) text. The scheduling problem in this experiment is defined within the OR paradigm and has only a few parameters: due date, processing time, and penalty weight. The screen layout, replicated in Figure 99, was discussed in Chapter 7. The experimental subjects will be divided into two groups: one will use a graphic-based display, as shown in Figure 103, and the other a text-based display, shown in Figure 104. Both screens show exactly the same data.
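The experiment’s performance measure, weighted tardiness, and the general shape of a parallel dispatching heuristic can be sketched from the three parameters just listed. The dispatching shown below (take jobs in a chosen priority order and assign each to the machine that becomes free first) is a generic list-scheduling reading of a ‘parallel’ heuristic, not the Parsifal implementation, and the job data are invented for illustration.

```python
# Sketch of weighted tardiness on identical parallel machines. Jobs carry the
# three experiment parameters: processing time p, due date d, penalty weight w.
# The dispatching shown (priority order, earliest-free machine) is a generic
# list-scheduling reading of a 'parallel' heuristic, not Parsifal's code.
import heapq

jobs = {            # illustrative data: name -> (p, d, w)
    "J1": (4, 10, 1.0),
    "J2": (6,  8, 2.0),
    "J3": (3, 12, 1.5),
    "J4": (5,  9, 1.0),
}

def dispatch(priority_order, n_machines):
    """Assign jobs in the given order to whichever machine is free first."""
    machines = [(0.0, m) for m in range(n_machines)]   # (free time, machine id)
    heapq.heapify(machines)
    completion = {}
    for name in priority_order:
        free_at, m = heapq.heappop(machines)
        finish = free_at + jobs[name][0]
        completion[name] = finish
        heapq.heappush(machines, (finish, m))
    return completion

def weighted_tardiness(completion):
    return sum(w * max(0, completion[name] - d)
               for name, (p, d, w) in jobs.items())

# SPT and EDD priority orders over the same job set
spt = sorted(jobs, key=lambda n: jobs[n][0])            # shortest processing time
edd = sorted(jobs, key=lambda n: jobs[n][1])            # earliest due date
for label, order in (("SPT", spt), ("EDD", edd)):
    print(label, weighted_tardiness(dispatch(order, n_machines=2)))
```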

Figure 103. The graphic form of the screen for the experiment.

Figure 104. The text form of the screen for the experiment.

The performance of the parallel scheduling heuristics, and a benchmark value produced by Parsifal as discussed in Chapter 5, will be compared with the performance of subjects using an HIPSS. The subjects will be able to call on simple scheduling heuristics (SPT, EDD and R&M) and parallel versions of SPT and R&M. The measure of performance will be weighted tardiness. The hypotheses that will be tested are:

1. Subjects using an HIPSS to make fully manual moves or to apply simple heuristics to any set of jobs can produce schedules with performance comparable to the parallel heuristics applied to the full set of jobs.


2. Subjects using parallel heuristics to first distribute jobs between machines can make further improvements by making fully manual moves or by applying simple heuristics to any set of jobs.

The two forms of the screen will be used to test the following hypothesis: subjects using graphic signs designed according to the principles discussed in Chapters 4 and 7 can produce better schedules than subjects using alphanumeric signs. Some questions providing a focus for the experimental design are:

• How hard or easy is it for humans to identify differences and similarities in the graphic symbols?
• How does an object’s graphic form affect the scheduler’s search strategy?
• How many graphic objects on a screen can a human reasonably manage?
• What is the upper bound on the number of jobs that a human can search and manipulate?

Finally, the methodological tools developed in this thesis may be extended beyond production scheduling to decision-making in other complex decision environments in which there are many competing and conflicting goals.


Bibliography

Abbott, L. S. ( 1982). Proceedings of workshop on cognitive modeling of nuclear power plant control room operators (NUREG/CR-3114). Oak Ridge, TN: Oak Ridge National Laboratory. Ackoff, R. L. (1979a). The future of operational research is past. Journal of the Operational Research Society, 30, 93-104. Ackoff, R. L. (1979b). Resurrecting the future of operational research. Journal of the Operational Research Society, 30, 189-199. Ackoff, R. L., and Emery, F. E. (1972). On purposeful systems, London: Tavistock. Ammons, J. C., Govindaraj, T., and Mitchell, C. M. (1986). Human aided scheduling for FMS: A supervisory control paradigm for real-time control of flexible manufacturing systems. Annals of Operations Research 15, 313-335. Ammons, J. C., Govindaraj, T., and Mitchell, C. M. (1988). Decision models for aiding flexible manufacturing system scheduling and control. IEEE Transactions on Systems, Man, and Cybernetics, SMC-18, 744-756. Anthony, R. N. (1988). The Management Control Function, Boston: Harvard Business School Press. Arend, U., Muthig, K.-P. and Wandmacher, J. (1987). Evidence for global feature superiority in menu selection by icons, Behaviour & Information Technology, 6, 411-426. Arnold, B. and Roe, R. (1987). User errors in human-computer interaction. In M. Frese, E. Ulich, & W. Dzida (Eds.) Psychological issues of humancomputer interaction in the work place, Amsterdam: North-Holland, pp. 203-220. 319

Aström, K. L. (1985). Process Control - past present and future. IEEE Control Systems Magazine, 5, 3-10. Bainbridge, L. (1981). Mathematical equations or processing routines? In J. Rasmussen and W. B. Rouse Human Detection and Diagnosis of System Failures, New York: Plenum Press, pp. 259-286. Bainbridge, L. (1983). Ironies of automation. Automatica, 19 (6), 775-779. Baker, K. R. (1974). Introduction to Sequencing and Scheduling, Wiley. Barfield, W., Hwang, S-L., and Chang, T-C. (1986). Human-computer supervisory performance in the operation and control of flexible manufacturing systems. In: A. Kusiak (Ed.) Flexible Manufacturing Systems, North-Holland, 377-408. Barnard, P. J. (1987). Cognitive resources and the learning of human-computer dialogs. In J. M. Carroll (Ed.), Interfacing Thought: Cognitive Aspects of Human-Computer Interaction, Cambridge, MA: MIT Press, pp. 112-158. Barr, A., Feigenbaum, E. A. (Eds.) (1981). The Handbook of Artificial Intelligence, Volume I, A Reading, MA: Addison-Wesley. Bartholdi, J. J. III, and Platzman, L. K. (1988). Heuristics based on spacefilling curves for combinatorial problems in Euclidean space. Management Science 34 (3), 291-305. Bauer, A., Bowden, R., Browne, J., Duggan, J., Lyons, G. (1991). Shop Floor Control Systems: From Design to Implementation, Chapman & Hall. Bauer, A., Bowden, R., Browne, J, Duggan, J. and Lyons, G., (1994). Shop Floor Control Systems: From Design to Implementation, 2nd ed., Chapman & Hall. Beer, S. (1966). Decision and Control, London: John Wiley.


Beishon, R. J. (1974). An analysis and simulation of an operator’s behaviour in controlling continuous baking ovens. In E. Edwards and F. P. Lees (Ed.) The Human Operator in Process Control, London: Taylor & Francis, pp. 79-90 [reprint from F. Bresson and M. de Montmollin (Ed.) (1969). The simulation of human behavior, Paris, Dunod]. Belton, V., and Elder, M. D. (1996). Exploring a Multicriteria Approach to Production Scheduling. Journal of the Operational Research Society 47, 162-174. Beltracchi, L. (1987). A direct manipulation interface for water-based rankine cycle heat engines. IEEE Transactions on Systems, Man, and Cybernetics, SMC-17, 478-487. Ben-Arieh, D. (1988). Knowledge-based routing and control system for FMS. In S. T. Kumara, A. L. Soyster, R. L. Kashyap (Eds.) Artificial intelligence: manufacturing theory and practice, Norcross, pp. 631-646. Benda, P. and Sanderson, P. (1998). Towards a dynamic model of adaptation to technological change. Proceedings of the Australia-New Zealand Conference in Computer-Human Interaction (OzCHI98), Adelaide, South Australia, IEEE Computer Society, pp. 244-251. Bereiter, S. R., and Miller, S. M. (1988). Sources of difficulty in Troubleshooting Automated Manufacturing Systems. In W. Karwowski (Ed.) Ergonomics of Hybrid Automated Systems I: Proceedings of the First International Conference on Ergonomics of Advanced Manufacturing and Hybrid Systems, Louisville, Kentucky, August 15-18, 1988, Elsevier, pp. 37-50. Bereiter, S. R. and Miller, S. M. (1989). A field-based study of troubleshooting in computer-controlled manufacturing systems. IEEE Transactions on Systems, Man, and Cybernetics, SMC-19, 205-219. Bergeron, H. P. (1981). Single-pilot IFR autopilot complexity/benefit trade-off study. Journal of Aircraft, 260, 39-47. Bertin, J. (1981). Graphics and Graphic Information-Processing. De Gruyter.


Blackstone, J. H., Phillips, D. T., Hogg, C. L. (1982). A state of the art Survey of Dispatching rules for manufacturing job shop performance. International Journal of Production Research, 20 (1), 27-45. Blattberg, R. C., and Hoch, S. J. (1990). Database models and managerial intuition: 50% model and 50% manager. Management Science, 36, 887899. Blazewicz, J., Domschke, W., and Pesch, E. (1996). The job shop scheduling problem: Conventional and new solution techniques. European Journal of Operational Research, 93, 1-33. Blumberg, M. and Alber, A. (1982). The human element: Its impact on the productivity of advanced batch manufacturing systems. Journal of Manufacturing Systems, 1, 45-53. Bødker, S. (1991). Through the Interface: A Human Activity Approach to User Interface Design, Hillsdale, NJ: Lawrence Erlbaum Associates. Boothroyd, K. E. (1978). Articulate Intervention, Taylor and Francis, London. Bovair, S., Kieras, D. E., and Polson, P. G. (1990). The acquisition and performance of text-editing skill: A cognitive complexity analysis. Human Computer Interaction, 5, 1-48. Bower, G. H. (1972). Mental images and associative learning. In L. Gregg (Ed.) Cognition in learning and memory, New York: Wiley. Brödner, P. (1985). Skill based production: the superior concept to the “unmanned factory”. In H.-J. Bullinger and H. J. Warnecke Toward the Factory of the Future: Proceedings of the 8th International Conference on Production Research and 5th working Conference of the Fraunhofer-Institute for Industrial Engineering, Berlin: Springer-Verlag , pp. 500-505. Brödner, P (1990). Technocentric-anthropocentric approaches: towards skillbased manufacturing. In M. Warner, W. Wobbe and P. Brödner (Eds.) New Technology and Manufacturing Management: Strategic Choices for Flexible Production Systems, Wiley, pp. 101-112. 322

Brown, D. E., Marin, J. A., Scherer, W. T. (1995). A survey of intelligent scheduling systems. In D. E. Brown and W. T. Scherer (Eds.) Intelligent Scheduling Systems, Kluwer Publishers, pp. 1-40. Browne, J., Boon, J. E., and Davies, B. J. (1981) Job shop control. International Journal of Production Research, 19 (6), pp. 633-643. Browne, J., Harhen, J., Shivnan, J. (1988). Production Management Systems: A CIM Perspective, Addison-Wesley. Brubaker, R. (1984). The Limits of Rationality: An Essay on the Social and Moral Though of Max Weber, London: Allen and Unwin. Buxey, G. (1989). Production scheduling: practice and theory. European Journal of Operational Research, 39, 17-31. Buzacott, J. A. and Yao, D. D. (1986). Flexible Manufacturing Systems: a review of analytic models. Management Science, 32(7), 890-905. Card, S. K. (1984). Human Limits and the VDT Computer Interface. In J. Bennett, D. Case, J. Sandelin and M. Smith (Eds.) Visual Display Terminals: Usability Issues and Health Concerns, Prentice-Hall, pp. 117-155 [reprinted in Baecker, R. M. and Buxton, A. S. (Eds.) (1987). Readings in Human-Computer Interaction: A Multidisciplinary Approach, Morgan Kaufmann Publishers, San Mateo, CA, USA, pp. 180-191]. Card, S., Moran, T., and Newell, A. (1983) The Psychology of Human-Computer Interaction, Hillsdale, NJ: Lawrence Erlbaum Associates. Carroll, D. C. (1965 ). Heuristic Sequencing of Single and Multiple Component Jobs, Ph.D. dissertation, Sloan School of Management, M.I.T. Casner, S. M. (1991). A task-analytic approach to the automated design of graphic presentations. ACM Transactions on Graphics, 10 (2), 111-151. Checkland, P. B. (1981). Systems Thinking, Systems Practice, John Wiley & Sons.


Cheng. T. C. E. and Gupta, M. C. (1989). Survey of scheduling research involving due-date determination decisions. European Journal of Operational Research, 38, 156-166. Chubb, G. P., Laughery, K. R., Jr, and Pritsker, A. A. B. (1987). Simulating manned systems. In G. Salvendy (Ed.) Handbook of Human Factors, Wiley, pp. 1298-1327. Churchman, C. W. (1971). The design of inquiring systems, Basic Books. Chou, C. K., Jeng, W. H. and Jeng, Y. C. (1988). A dynamic dispatching rule in computerised manufacturing systems. In A. Mital (Ed.) Recent Developments in Production Research: Collection of Refereed Papers Presented at the IXth International Conference on Production Research, Elsevier, pp. 292-297. Coffman, E. G., Garey, M., and Johnson, D. (1978). An application of binpacking to multi-processor scheduling. SIAM Journal of Computing, 7, 1-16. Conway, R. W. (1965a). Priority Dispatching and work in process inventory in a job shop. Journal of Industrial Engineering, 16 (2) 123-130. Conway, R. W. (1965b). Priority dispatching and job lateness in a job shop. Journal of Industrial Engineering, 16 (4), 228-237. Conway, R. W., Maxwell, W. L. (1962). Network dispatching by shortest operation discipline. Operations Research, 10 (51). Conway, R. W. Maxwell, W. L., Miller, L. W. (1967). Theory of Scheduling, Addison-Wesley. Craik. K. J. W. (1943). The Nature of Explanation, Macmillan. De Montmollin, M., and De Keyser, V., 1986, Expert logic versus operator logic. In G. Mancini, G. Johannsen, and L. Martensson (Eds.) Analysis, Design, and Evaluation of Man-Machine Systems, New York: Pergamon, pp. 4349.


Drury, C. G., and Prabhu, P. V. (1994). Human factors in test and inspection. In: G. Salvendy and W. Karwowski (Eds.), Design of Work and Development of Personnel in Advanced Manufacturing, John Wiley & Sons, 355-401. Dudek, R. A., Panwalker, S. S., and Smith, M. L. (1992). The lessons of flowshop scheduling research. Operations Research, 40 (1), 7-13. Dunkler, O., Mitchell, C. M., Govindaraj, T., and Ammons, J. (1988). The effectiveness of supervisory control strategies in scheduling flexible manufacturing systems. IEEE Transactions on Systems, Man and Cybernetics, SMC-18, 223-237. Dutton, J. M. (1962). Simulation of an actual production scheduling and workflow control system. International Journal of Production Research, 1 (4), 21-41. Dutton, J. M., and Starbuck, W. (1971) Finding Charlie's run-time estimator. In J. M. Dutton and W. Starbuck (Eds.) Computer simulation of human behavior, New York: Wiley, pp. 218-242. Eberts, R. E. (1993). User Interface Design. Englewood Cliffs, NJ: Prentice Hall. Eberts, R. E., & Eberts, C. G. (1989). Four approaches to human computer interaction. In P. A. Hancock & M. H. Chignell (Eds.) Intelligent Interfaces: Theory, Research and Design, Elsevier Science Publishers (North-Holland), pp. 69-127. Edlund, C. and Lewis, M. (1994). A Metric for Situated Difficulty. Unpublished, copy courtesy of the authors. Edlund, C., Weise, C. and Lewis, M. (1995). Context engineering for human problem solving. IJCAI-95 Workshop on Modeling Context in Knowledge Representation and Reasoning. August 20. Montreal, Canada. Edwards, E. and Lees, F. P. (1973). Man-Computer interaction in process control, Institute of Chemical Engineers, London.


Eggleston, R. G. (1987). The changing nature of the human-machine design problem: Implications for system design and development. In W. B. Rouse (Ed.) System Design: Behavioral Perspectives on Designers, Tools, and Organizations, Amsterdam: North Holland, pp 113-125. Elleby, P., Fargher, H. E. and Addis, T. R. (1988). Reactive constraint-based system for industrial job-shop scheduling. In M. D. Oliff (Ed.) Expert Systems and Intelligent Manufacturing, North-Holland, pp. 1-10. Elvers, D. A. (1973). Job shop dispatching rules using various delivery date setting criteria. Production Inventory Management, 14, 62. Emmons, H. (1987). Scheduling and sequencing algorithms. In John A. White (Ed.) Production Handbook, 4th ed., New York: John Wiley & Sons, pp. 3-159–3-175. Erschler, J., Roubellat, F. and Thuriot, C. (1985). Scheduling with resource constraints. International Conference EURO VII, Bologne, 17-19 June. Erschler, J., and Roubellat, F. (1989). An approach to solve workshop real time scheduling problems. In S. Y. Nof & C. L. Moodie (Eds.) Advanced Information Technologies for Industrial Material Flow Systems, Heidelberg: Springer-Verlag, 651-679. Evans, G. W. and Karwowski, W. (1986). A perspective on mathematical modeling in human factors. In Waldemar Karwowski and Anil Mital (Eds.) Applications of Fuzzy Set Theory in Human Factors, Elsevier, pp. 9-12. Ferguson, R. L. and Jones, C. H. (1969). A computer aided decision system. Management Science, 15, B550-B561. Fisher, J. (1992). Use of nonfinancial performance measures. Journal of Cost Management, Spring 1992, 31-38. Flach, J. M. (1988). Direct manipulation, direct engagement, and direct perception: What’s directing what? Proceedings of the Human Factors Society, 32, 1355-1358. 326

Flach, J. M. (1990). The ecology of human-machine systems I: Introduction. Ecological Psychology, 2, 191-205. Flach, J. M. (1995). The ecology of human-machine systems: A Personal History. In J. M. Flach, P. Hancock, J. Caird, and K. J. Vicente (Eds.). Global Perspectives on the Ecology of Human-Machine Systems, Hillsdale NJ: Lawrence Erlbaum Associates, pp. 1-13. Foulds L.R. (1984). Combinatorial optimization for undergraduates, Springer. Fox, B. R. and Kempf, K. G. (1985). Opportunistic scheduling for robotic assembly. Proceedings of the IEEE International Conference on Robotics and Automation, Saint Louis, Missouri. Fox, M. S. (1990). AI and expert system myths, legends and facts. IEEE Expert, Feb. 1990. Fox, M. S., and Sadeh, N. (1990) Why is scheduling difficult? A CSP perspective. Proceedings of the 9th European Conference on Artificial Intelligence, August 6-10, 1990, Stockholm, Sweden, pp. 754-767. Fox, M. S., and Smith, S. F. (1984). ISIS: A knowledge-based system for factory scheduling. Expert Systems, 1, 25-49. French, S. (1982). Sequencing and Scheduling: An Introduction to the Mathematics of the Job-Shop, Chichester: Ellis Horwood. Garey, M. R. and Johnson, D. S. (1979). Computers and Intractability: A Guide to the Theory of NP-Completeness, San Francisco: W. H. Freeman & Co. Gary, K,. Uzsoy, R., Smith, S. P., and Kempf, K. (1995). Measuring the quality of manufacturing schedules. In D. E. Brown and W. T. Scherer (Eds.) Intelligent Scheduling Systems, Kluwer Publishers, pp. 129-154. Gault, R. (1984). OR as education. European Journal of Operational Research, 16, 293-307.


Gaver, W. (1991). Technological affordances. In Proceedings of the CHI '91 Conference on Computer and Human Interaction. New York: ACM. Gibson, J. J. (1966). The senses considered as perceptual systems. Boston: Houghton Mifflin. Gibson, J. J. (1977). The theory of affordances. In R. Shaw & J. Bransford (Eds.), Perceiving, acting and knowing. Hillsdale, N.J.: Lawrence Erlbaum Associates. Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin. Gibson, R., and Laios, L. (1978). The Presentation of Information to the Job-Shop Scheduler. Human Factors, 20 (6), 725-732. Giddens, A. (1995). A contemporary critique of historical materialism. 2nd ed. Basingstoke: Macmillan. Goldratt, E., Cox, J. (1986). The Goal, USA: Creative Output Books. Goodstein, L. P., Andersen, H. B., and Olsen, S. E. (Eds.) (1988). Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis. Gorry, G. A. and Scott Morton, M. S. (1971). A framework for management information systems. Sloan Management Review, Fall, 55-70. Grabot, B. and Geneste, L. (1994). Dispatching rules in scheduling: a fuzzy approach. International Journal of Production Research, 32(4), 903-915. Graham, R. L., Lawler, E. L., Lenstra, J. K., and Rinnooy Kan, A. H. G. (1979) Optimisation and approximation in deterministic sequencing and scheduling: a survey. Annals of Discrete Mathematics, 5, 287-326, Grant, T. J. (1986). Lessons for O.R. from A.I.: A Scheduling Case Study. Journal of Operations Research Society, 37(1), pp. 41-57.


Graves, S. C. (1981). A review of production scheduling. Operations Research, 29 (4), 646-675. Green, T. R. G., 1990, Limited theories as a framework for human-computer interaction. In D. Ackermann & M. J. Tauber (Eds.) Mental Models and Human-Computer Interaction 1 Amsterdam: North-Holland pp. 3-37. Greif, S. (1991) The role of German work psychology in the design of artifacts. In J. M. Carroll Designing interaction: Psychology at the human-computer interface, Cambridge University Press, pp. 203-226. Habermas, J. (1971). Toward a Rational Society, London: Heinemann. Hacker, W. (1982). Wanted: A grammar of actions? cognitive control of goaldirected actions (Review II). In Hacker, W., Volpert, W., and von Cranach, M., (Eds.) Cognitive and Motivational Aspects of Action. Amsterdam: North Holland, pp. 17-24. Hacker, W. (1986). Arbeitspsychologie: Psychische Regulation von Arbeitsträtigkeiten, Berlin: VEB Deutscher Verlag der Wissenschaften. Hacker, W. (1994). Action regulation theory and occupational psychology. Review of German Empirical Research Since 1987. The German Journal of Psychology, 18(2), 91-120. Hacker (1998). Action control and motor performance in work. In Sport Kinetics, Edition zwalina. Hamburg: Feldhaus. Hacker, W., Volpert, W., and von Cranach, M., (Eds.)(1982). Cognitive and motivational aspects of Action, Amsterdam: North-Holland. Haider, S. W., Moodie, C. L., and Buck, J. R. (1981). An investigation of the advantages of using a man-computer interactive scheduling methodology for job shops. International Journal of Production Research, 19, 381-392.


Hancock, P. A., and Chignell, M. H. (1995). On human factors. In J. M. Flach, P. Hancock, J. Caird, and K. J. Vicente (Eds.). Global Perspectives on the Ecology of Human-Machine Systems, Hillsdale NJ: Lawrence Erlbaum Associates, pp. 14-53. Hansen, J. P. (1995). Representation of system invariants by optical invariants in configural displays for process control. In P. Hancock, J. M. Flach, J. Caird, and K. J. Vicente (Eds.) Local Applications of the Ecological Approach to Human-Machine Systems, Hillsdale NJ: Lawrence Erlbaum Associates, pp. 208-233. Hax, A. C. (1987) Aggregate production planning. In John A. White (Ed.) Production Handbook, 4th Ed., New York: John Wiley & Sons, 3-116—3127. Hettenbach, D. A., Mitchell, C. M., and Govindaraj, T. Decision-making in supervisory control of a flexible manufacturing system. Information and Decision Technologies, 17, pp. 255-278, 1991. Higgins, P. G. (1992). Human-computer production scheduling: contribution to the hybrid automation paradigm. In P. Brödner and W. Karwowski (Eds.) Ergonomics Of Hybrid Automated Systems - III: Proceedings of the Third International Conference on Human Aspects of Advanced Manufacturing and Hybrid Automation, Gelsenkirchen, August 26-28 1992, Germany, Elsevier, pp. 211-216. Higgins, P. G. (1993). Hybrid human-computer production scheduling. In Proceedings of ACME'93, Adelaide, November 1993, pp. 203-207. Higgins, P. G. (1994a). A graphical display to support human-computer decisionmaking in production scheduling. In P. T. Kidd and W. Karwowski (Eds.) Advances in Agile Manufacturing: Fourth International Conference on Human Aspects of Advanced Manufacturing and Hybrid Automation, July 6-8 1994, Manchester: IOS Press, pp. 317-320.


Higgins, P. G. (1994b). Graphical features for aiding decision-making in production scheduling. In Proceedings of OZCHI 94, Melbourne, Australia, 1994, pp 261-266. Higgins, P. G. (1995). Interaction in hybrid intelligent production scheduling. In F. Burstein P. O'Donnell A. Gilbert (Eds.) Proceedings of the Melbourne Intelligent Decision Support Workshop, March 20, 1995, Monash University, pp. 98-102. Higgins, P. G. (1996a). Interaction in hybrid intelligent scheduling. The International Journal of Human Factors in Manufacturing, 6(3), 185-203. Higgins, P. G. (1996b). Using graphics to display messages in an intelligent decision support system. In F. Burstein, H. Linger, H. Smith (Eds.) Proceedings of the Second Melbourne Workshop on Intelligent Decision Support Systems, IDS’96, Monash University, September 9th 1996, Monash University, pp. 32-38. Higgins, P. (1998). Extending cognitive work analysis to manufacturing scheduling. In P. Calder and B. Thomas (Eds.) Proceedings 1998 Australian Computer Human Interaction Conference, OzCHI’98, November 30-December 4, Adelaide, IEEE, pp. 236-243. Higgins, P. G., and Wirth, A. (1995). Interactive scheduling: how to combine operations research heuristics with human abilities. 6th International Conference on Manufacturing 95, Melbourne, The Institution of Engineers, Australia, pp. 293-302. Higgins, P. G., and Wirth., A. (1997). Interactive scheduling. APORS’97 Proceedings: Melbourne 1-4 December 1997. [Online] Available http://www.maths.mu.oz.au/~worms/apors/program/. Hoch, S. J., and Schkade, D. A. (1996). A psychological approach to decision support systems. Management Science, 42(1), 51-64.


Hoffman, J. E. (1980). Interaction between global and local levels of form. Journal of Experimental Psychology: Human Perception and Performance, 6, 222-234. Hollnagel, E. (1981). Report from the third NKA/KRU experiment: The performance of control engineers in the surveillance of a complex process, NKA/KRU-P2(81)36, Risø National Laboratory. Hsu, W., Prietula, M. J., Thompson, G. L. and Ow, P. S. (1993). A mixedinitiative scheduling workbench - integrating AI, OR, HCI. Decision Support Systems, 9, 245-257. Hughes, J. G. (1988). Database Technology: A Software Engineering Approach, Prentice-Hall International, Hemel Hempstead. Husserl, E. (1936). The origin of geometry. In Luckmann, T. (Ed.) (1978). Phenomenology and Sociology, Harmondsworth: Penguin Books. Hutchins, E. (1995). How a cockpit remembers its speed. Cognitive Science, 19 (3) 265-288. Hutchins, E., Hollan, J., and Norman, D. A. (1986). Direct manipulation interfaces. In D. A. Norman & S. Draper (Eds.) User centered system design: New perspectives in human-computer interaction, Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 87 – 124. Huttenlocher, J. (1968). Constructing spatial images: A strategy in reasoning. Psychological Review, 75, 550-560. Hwang, S.-L., and Salvendy, G., (1983). Human supervisory performance and subjective responses in flexible manufacturing systems. Proceedings of the Human Factors Society 28th Annual Meeting, Santa Monica, Human Factors Society, 1983, pp. 664-669. Hwang, S.-L., Sharit, J., and Salvendy, G. (1983). Management strategies for the design, control and operation of flexible manufacturing systems. Proceedings of the Human Factors Society 27th Annual Meeting, Santa Monica, CA, 1983, Human Factors Society, pp. 297-301. 332

Jackson, G. A. (1988). Relational Database Design with Microcomputer Applications, Englewood Cliffs, NJ: Prentice-Hall International. Jackson, J. R. (1957). Networks of waiting lines, Operations Research, 5, 518. Jackson, S. and Browne, J. (1989). An interactive scheduler for production activity control. International Journal of Computer Integrated Manufacturing, 2(1), 2-15. Jain, A. K. (1987) Manufacturing Management. In John A. White (Ed.) Production Handbook, 4th Ed., New York: John Wiley & Sons, pp. 158—1-65. Jones, C. V., and Maxwell, W. L. (1986). A system for manufacturing scheduling with interactive computer graphics. IIE Transactions, 298-303. Johnson, Bob (1992). Department of Management Science Pennsylvania State University, personal communication, Jan 29, 1992. Johnson, G. I., and Wilson, J. R. (Ed.) (1988). Ergonomics matters in advanced manufacturing technology. Applied Ergonomics, 19 (1), whole issue. Johnson, S. M. (1954). Optimal two and three-stage production schedules with setup times included. Naval Research Logistics Quarterly 1, 61-68. Johnson-Laird, P. N. (1983). Mental Models: Towards a Cognitive Science of Language, Inference, and Consciousness, Cambridge University Press. Karmarkar, U. S. (1988). A hierarchical scheduling system for the CIM environment. In A. Mital (Ed.) Recent Developments in Production Research: Collection of Refereed Papers Presented at the IXth International Conference on Production Research, Elsevier, pp. 271-277. Karwowski, W. (1988). Introduction. In W. Karwowski (Ed.) Ergonomics of Hybrid Automated Systems I: Proceedings of the First International Conference on Ergonomics of Advanced Manufacturing and Hybrid Systems, Louisville, Kentucky, August 15-18, 1988, Elsevier.


Keen, P. G. W., and Scott Morton, M. S. (1978). Decision Support Systems. An Organisation Perspective, Reading, Mass: Addison-Wesley. Kempf, K., Le Pape, C., Smith, S. F., and Fox, B. R. (1991). Issues in the design of AI-based schedulers: A workshop report. AI Magazine, 37-46. Kerr, R. (1991). Knowledge-Based Manufacturing Management: Applications of artificial intelligence to the effective management of manufacturing companies, Addison-Wesley, Sydney. Kerr, R. M. and Ebsary, R. V. (1988). Implementation of an expert system for production scheduling in a small manufacturing company - A case study. Journal for Operational Research, 33, 17-29. Kieras, D., and Polson, P. G. (1983). A generalized transition network representation for interactive systems. CHI'83 Proceedings, 103-106. Kieras, D., and Polson, P. G. (1985). An approach to the formal analysis of user complexity. International Journal of Man-Machine Studies, 22, 365-394. Kinsley, A.-M. (1994). Application of ecological interface design to advanced manufacturing: Revised dissertation proposal. Unpublished report, Center for Cognitive Science, State University of New York, Buffalo. Kinsley, A.-M., Sharit, J., and Vicente, K. J. (1994). Abstraction hierarchy representation of manufacturing: towards ecological interfaces for advanced manufacturing systems. In P. T. Kidd and W. Karwowski (Eds.) Advances in Agile Manufacturing: Fourth International Conference on Human Aspects of Advanced Manufacturing and Hybrid Automation, July 6-8 1994, Manchester. IOS Press, pp. 297-300. Kirakowski, J. (1997). Affordances. [Online] Available Usability Testing listserv. [email protected] 22 September 1997. Kirlik. A. (1992). Where affordance notions are and aren’t useful. [Online] Available usenet comp.human-factors 25 January 1992.


Kirlik, A. (1995). Requirements for psychological models to support design: Towards ecological task analysis. In J. M. Flach, P. Hancock, J. Caird, and K. J. Vicente (Eds.) Global Perspectives on the Ecology of HumanMachine Systems, Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 68120. Kirlik, A., Miller, R. A., and Jagacinski, R. J. (1993). Supervisory control in a dynamic and uncertain environment: A process model of skilled humanenvironment interaction. IEEE Transactions in Systems, Man, and Cybernetics, SMC-23, 929-952. Klein, G. (1989). Recognition-primed decisions. In W. B. Rouse (Ed.) Advances in Man-Machine Systems Research, Vol. 5, Greenwich, Connecticut: JAI Press, pp. 47-92. Kleinmuntz, B. (1990). Why we still use our heads instead of formulas: toward an integrative approach. Psychological Bulletin, 107, 296-310. Krosner, S. P., Mitchell, C. M., and Govindaraj, T. (1989). Design of an FMS operator workstation using the Rasmussen abstraction hierarchy. Proceedings of the 1989 International Conference on Systems, Man, and Cybernetics New York: IEEE, pp. 959-964. Kuhn, T. (1970). The Structure of Scientific Revolutions, 2nd ed., The University of Chicago Press. Kusiak, A. (1990). A knowledge and optimization-based approach to scheduling in automated manufacturing systems. In D. E. Brown and C. C. White, III (Eds.) Operations Research and Artificial Intelligence: The Integration of Problem-Solving Strategies. Boston: Kluwer Academic Publishers, pp. 453-479. Larkin, J., and Simon, H. (1987). Why a diagram is (sometimes) worth 10,000 words. Cognitive Science 11, 65-99.


Lawler, E. L., Lenstra, J. K., Rinnooy Kan, A. H. G. and Shmoys, D. B. (1993). Sequencing and scheduling: algorithms and complexity. In S. C. Graves, A. H. G. Rinnooy Kan, and P. H. Zipkin (Eds.) Logistics of Production and Inventory, Amsterdam: Elsevier, pp. 445-522. Le Pape, C. (1992). Subject: Planning v. Scheduling? [Online] Available usenet comp.ai, 22 April 1992. Lewis, C. M. (1991). Visualization and situations. In J. Barwise, J. M. Gawron, G. Plotkin and S. Tuyiya (Eds.) Situation Theory and its Applications. vol 2, CSLI Publications, Stanford University, 535-563. Lewis, C. M. (1997). Ecological cognition in visualization [Online] Available http://www.pitt.edu/~cmlewis/nsf.html [Accessed 16 July 1997]. Lind, M. (1988). System concepts and the design of man-machine interfaces for supervisory control. In L. P. Goodstein, H. B. Andersen and S. E. Olsen (Eds.) Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis, pp. 269-277. Lind, M. (1991). Decision models and the design of knowledge-based systems. In J. Rasmussen, B. Brehmer, and J. Leplat (Eds.) Distributed Decision Making: Cognitive Models for Co-operative Work, John Wiley & Sons. Lindsay, R. W., and Staffon, J. D. (1988). A model based display system for the experimental breeder reactor-II. Joint meeting of the American Nuclear Society and the European Nuclear Society, Washington, DC, November, 1988. Lintern, G. and Naikar, N. (1998). Cognitive work analysis for training system design. in P. Calder and B. Thomas (Eds.) Proceedings 1998 Australian Computer Human Interaction Conference, OzCHI’98, November 30 – December 4, Adelaide, IEEE, pp. 252-259.


MacCarthy, B. L., and Liu, J. (1993). Addressing the gap in scheduling research: a review of optimization and heuristic methods in production scheduling. International Journal of Production Research, 31, 59-79. McKay K.N. (1987). Conceptual Framework for Job Shop Scheduling, MASc Dissertation, Department of Management Science, University of Waterloo. McKay K.N. (1992). Production Planning and Scheduling: A Model for Manufacturing Decisions Requiring Judgement, PhD Dissertation, Department of Management Science, University of Waterloo. McKay, K. (1994). Personal correspondence. [Online] Available email: [email protected] from [email protected], January 6, 1994. McKay, K.N. (1997). Scheduler adaptation in reactive settings - design issues for context-sensitive scheduling tools. Proceedings of IE Conference Practice and Applications, November 1997, San Diego. McKay K.N., and Buzacott, J.A. (1999). Adaptive production control in modern industries. In P. Brandimarte and A. Villa (Eds.). Modeling Manufacturing Systems: From Aggregate Planning to Real-Time Control. SpringerVerlag. McKay, K. N., Buzacott, J. A., and Safayeni, F. R. (1989a). The scheduler’s knowledge of uncertainty: The missing link. In J. Browne (Ed.) Knowledge Based Production Management Systems, Amsterdam: Elsevier, pp. 171-189. McKay, K. N., Buzacott, J. A., and Safayeni, F. R. (1989b). The schedulers desk—can it be automated? Decisional structure in automated manufacturing. Proceedings of the IFAC/CIRP/IFIP/IFORS Workshop, Genoa, Italy, September 18-21, pp. 57-61.


McKay, K. N., Buzacott, J. A., and Safayeni, F. R. (1989c). The scheduler’s information system: what is going on? insights for automated environments, information control problems in manufacturing technology. Selected papers from the IFAC/IFIP/IFORS/IMACS Symposium, Madrid, Spain, September 26-29, pp. 327-331. McKay, K. N., Safayeni, F. R., and Buzacott, J. A. (1988). Job-shop scheduling theory: what is relevant? Interfaces, 18 (4), 84-90. McKay, K. N., Safayeni, F. R., and Buzacott, J. A. (c1992). Conventional Wisdom Versus Field Driven Views of Planning and Scheduling, unpublished, copy courtesy of the author. McKay, K.N. and Wiers, V. C. S. (1997). Unifying the theory and practice of production scheduling. Under second review, special issue of Journal of Manufacturing Systems on empirical research on production control. Mackinlay, J. (1986). Automating the design of graphical presentations of relational information. ACM Transactions on Graphics, 5(2), 110-141. Mackworth, N. H. (1948). The breakdown of vigilance during prolonged visual search. Quarterly Journal of Experimental Psychology, 1, 22-40. Matlin, M. W. (1988). Sensation and Perception. Allyn and Bacon. Meister, D. (1966). Human factors in reliability. In W. G. Ireson (Ed.) Reliability Handbook, McGraw-Hill. Mellor, P. (1979). A review of job-shop scheduling. Operational Research, 27, 792-798. Meshkati, N. (1990). Integration of workstation, job, and team structure design in complex human-machine system. In W. Karwowski, M. Rahimi (Eds.) Ergonomics of Hybrid Automated Systems II, Amsterdam: Elsevier, pp. 59-68.


Miller, J. R., Sullivan, J. W. and Tyler, S. W. (1991). Introduction. In J. W. Sullivan and S. W. Tyler (Eds.) Intelligent User Interfaces, Reading, Mass: Addison-Wesley, pp. 1-10.
Mitchell, C. M. (1987). GT-MSOCC: A domain for research on human-computer interaction and decision aiding in supervisory control systems. IEEE Transactions on Systems, Man and Cybernetics, SMC-17 (1), 553-572.
Mitchell, C. M. (1990). Supervisory control: philosophical considerations in manufacturing systems. In A. P. Sage (Ed.) Concise Encyclopedia of Information Processing Systems and Organisations, Oxford: Pergamon Press.
Mitchell, C. M., Govindaraj, T., Armstrong, J. E., Benson, C. R., and Hettenbach, D. A. (1991). Human supervisory control of predominantly automated manufacturing processes: conceptual issues and empirical investigations. Control and Dynamic Systems, Vol. 46, 255-306.
Mitchell, C. M., and Miller, R. A. (1986). A discrete control model of operator function: A methodology for information display design. IEEE Transactions on Systems, Man and Cybernetics, SMC-16 (3), 343-357.
Mitchell, C., and Saisi, D. (1987). Use of model-based qualitative icons and adaptive windows in workstations for supervisory control systems. IEEE Transactions on Systems, Man, and Cybernetics, SMC-17 (1).
Mital, A. (Ed.) (1988). Recent Developments in Production Research: Collection of Refereed Papers Presented at the IXth International Conference on Production Research, Elsevier.
Merabet, A. A. (1986). Dynamic job shop scheduling: an operating system based design. In A. Kusiak (Ed.) Flexible Manufacturing System: Methods and Studies, North-Holland, pp. 257-270.
Mohnkern, K. (1997). Beyond the interface metaphor. SIGCHI Bulletin, April 1997.


Moray, N. (1986). Monitoring behavior and supervisory control. In K. Boff, L. Kaufman and J. Thomas (Eds.) Handbook of Perception and Human Performance (Vol. 2), New York: Wiley, pp. 40-1—40-51.
Moray, N. (1988). Prologue: Ex risø semper aliquid antiquum: sources of a new paradigm for engineering psychology. In L. P. Goodstein, H. B. Andersen and S. E. Olsen (Eds.) Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis, pp. 116-127.
Moray, N. (1997). Models of models of … mental models. In T. B. Sheridan and T. Van Lunteren (Eds.) Perspectives on the Human Controller, Mahwah, NJ: Lawrence Erlbaum, pp. 271-285.
Moray, N. (1999). Mental models in theory and practice. In D. Gopher and A. Koriat (Eds.) Attention and Performance XVII: Cognitive Regulation of Performance: Interaction of Theory and Application, Cambridge, MA: MIT Press.
Moray, N., Dessouky, M. A., Kijowski, B. A., and Adapathya, R. (1990). Strategic Behavior, Workload and Performance in Task Scheduling, EPRL-90-06, University of Illinois at Urbana-Champaign.
Moray, N., Eisen, P., Money, L. and Turksen, I. B. (1988). Fuzzy analysis of skill and rule-based mental workload. In P. A. Hancock (Ed.) Human Mental Workload, Amsterdam: North Holland.
Morris, C. W. (1946). Signs, Language and Behavior. New York: G. Braziller.
Morris, N. M. and Rouse, W. B. (1985). Review and evaluation of empirical research in troubleshooting. Human Factors, 27 (5), 503-530.
Morton, T. E., and Pentico, D. W. (1993). Heuristic Scheduling Systems: With Applications to Production Systems and Project Management, New York: John Wiley & Sons.


Morton, T. E., and Smunt, T. L. (1986). A planning and scheduling system for flexible manufacturing. In A. Kusiak (Ed.) Flexible Manufacturing System: Methods and Studies, North-Holland, pp. 151-164.
Morton, T. E., Lawrence, S. R., Rajagopolan, S., and Kekre, S. (1986). MRPSTAR: PATRIARCH's Planning Module. GSIA, Carnegie Mellon University, Pittsburgh, PA.
Morton, T. E., Lawrence, S. R., Rajagopolan, S., and Kekre, S. (1988). SCHEDSTAR: a price-based shop scheduling module. Journal of Manufacturing and Operations Management, 1, 131-181.
Moyer, R. S. (1973). Comparing objects in memory: Evidence suggesting an internal psychophysics. Perception and Psychophysics, 13, 180-184.
Mumford, E. (1987). Sociotechnical systems design: evolving theory and practice. In G. Bjerknes, P. Ehn and M. Kyng (Eds.) Computers and Democracy: A Scandinavian Challenge, Aldershot, England: Avebury, Gower Publishing Limited, pp. 59-76.
Nakamura, N. (1990). Human modifying knowledge acquisition for FMS scheduling. In W. Karwowski, M. Rahimi (Eds.) Ergonomics of Hybrid Automated Systems II, Amsterdam: Elsevier, pp. 331-338.
Nakamura, N., and Salvendy, G. (1987). Human decision making in computer-based scheduling within a flexible manufacturing system: an experimental study. In G. Salvendy (Ed.) Cognitive Engineering in the Design of Human-Computer Interaction and Expert Systems, Elsevier, pp. 257-264.
Nakamura, N., and Salvendy, G. (1988). An experimental study of human decision-making in computer-based scheduling of flexible manufacturing systems. International Journal of Production Research, 26, 567-583.
Nakamura, N., and Salvendy, G. (1994). Human planner and scheduler. In G. Salvendy and W. Karwowski (Eds.) Design of Work and Development of Personnel in Advanced Manufacturing, John Wiley and Sons, pp. 331-354.


Nakamura, N., Shin, F.-Z., and Salvendy, G. (1991). Development and validation of a human-supervised system in scheduling of flexible manufacturing systems: A comparative study with knowledge-based and human operated systems. Journal of the Chinese Institute of Industrial Engineers, 8 (1), 35-51.
Navon, D. (1981). The forest revisited: more on global precedence. Psychological Research, 43, 1-32.
Neches, R., Langley, P., and Klahr, D. (1987). Learning, development and production systems. In D. Klahr, P. Langley and R. Neches (Eds.) Production System Models of Learning and Development, Cambridge, MA: MIT Press.
Neisser, U. (1993). The self perceived. In U. Neisser (Ed.) The perceived self: Ecological and interpersonal sources of self-knowledge, Cambridge University Press, pp. 3-21.
Newell, A. and Simon, H. A. (1963). GPS, a program that simulates human thought. In E. Feigenbaum and J. Feldman (Eds.) Computers and Thought, New York: McGraw-Hill.
Newell, A., and Simon, H. A. (1972). Human Problem Solving, Englewood Cliffs, NJ: Prentice-Hall.
Newman, P. A. (1988). Scheduling in CIM systems. In A. Kusiak (Ed.) Artificial Intelligence in Industry: Implications for CIM, New York: Springer-Verlag, pp. 361-402.
Norman, D. A. (1986). Cognitive engineering. In D. A. Norman and S. Draper (Eds.) User Centered System Design: New Perspectives in Human-Computer Interaction, Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 31-61.
Norman, D. A. (1988). The Psychology of Everyday Things. New York: Basic Books.


Norman, D. A. (1991). Cognitive artifacts. In J. M. Carroll (Ed.) Designing Interaction: Psychology at the Human-Computer Interface, Cambridge University Press, pp. 17-38.
Panwalkar, S. S. and Iskander, W. (1977). A survey of scheduling rules. Operations Research, 25, 45-61.
Papantonopoulos, S. (1990). A Decision Model for Cognitive Task Allocation, Doctoral thesis, Purdue University.
Paradies, M. W. (1985). Cognitive allocation and the control room. In Proceedings of the IEEE Third Conference on Human Factors and Power Plants, Monterey, CA.
Pew, R. W., and Baron, S. (1983). Perspectives on human performance modelling. Automatica, 19(6), 663-676.
Polanyi, M. (1966). The Tacit Dimension, London: Routledge & Kegan Paul.
Popper, K. R. (1972). Objective Knowledge, Oxford: Clarendon Press.
Popper, K. R. (1976). The logic of the social sciences. In T. W. Adorno (Ed.) The Positivist Dispute in German Sociology, London: Heinemann.
Rajgopal, J., and Bidanda, B. (1991). On scheduling parallel machines with two setup classes. International Journal of Production Research, 29(12), 2443-2458.
Ranky, P. G. (1986). Computer-Integrated Manufacturing, Englewood Cliffs, NJ: Prentice-Hall.
Rasmussen, J. (1976). Outlines of a hybrid model of the process plant operator. In T. B. Sheridan and G. Johannsen (Eds.) Monitoring Behavior and Supervisory Control, New York: Plenum, pp. 371-383.
Rasmussen, J. (1983). Skills, rules, and knowledge; signals, signs and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, 13(3), 257-266.


Rasmussen, J. (1985). Human Error Data: Fact or Fiction, Risø Report M-2499, Roskilde, Denmark: Risø National Laboratory.
Rasmussen, J. (1986). Information Processing and Human Machine Interaction: An Approach to Cognitive Engineering, New York: North-Holland.
Rasmussen, J. (1987). The role of hierarchical knowledge representation in decision making and systems management. In A. P. Sage (Ed.) System Design for Human Interaction, IEEE Press, pp. 301-310.
Rasmussen, J. (1988). A cognitive engineering approach to the modeling of decision making and its organization in process control, emergency management, CAD/CAM, office systems and library systems. In W. B. Rouse (Ed.) Advances in Man-Machine System Research, Vol. 4, J.A.I. Press Inc., pp. 165-243.
Rasmussen, J. (1990). Mental models and the control of action in complex environments. In D. Ackermann and M. J. Tauber (Eds.) Mental Models and Human-Computer Interaction 1, North-Holland, pp. 41-69.
Rasmussen, J. (1998a). Ecological interface design for complex systems: An example: SEAD-UAV systems. EOARD-Contract: F61708-97-W0211, Final report, April.
Rasmussen, J. (1998b). Notes on abstraction hierarchy analysis. [Online] Available Work Domain Analysis Workbench Users listserv, [email protected], 06 Jul 1998.
Rasmussen, J., and Goodstein, L. P. (1986). Decision support in supervisory control. In G. Mancini, G. Johannsen and L. Mårtensson (Eds.) Analysis, Design, and Evaluation of Man-Machine Systems, Pergamon, pp. 79-90.
Rasmussen, J., and Jensen, A. (1974). Mental procedures in real life tasks: a case study of electronic trouble shooting. Ergonomics, 17(3), 293-307.


Rasmussen, J., and Pejtersen, A. M. (1995). Virtual ecology of work. In J. M. Flach, P. Hancock, J. Caird, and K. J. Vicente (Eds.) Global Perspectives on the Ecology of Human-Machine Systems, Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 121-156.
Rasmussen, J., Pejtersen, A. M., and Goodstein, L. P. (1994). Cognitive Systems Engineering. John Wiley & Sons.
Reason, J. (1988). Framework models of human performance and error: a consumer guide. In L. P. Goodstein, H. B. Andersen and S. E. Olsen (Eds.) Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis, pp. 35-49.
Reason, J. (1990). Human Error, Cambridge: Cambridge University Press.
Rinnooy Kan, A. H. G. (1976). Machine Scheduling Problems: Classification, complexity, and computations, The Hague, Holland: Martinus Nijhoff.
Rodammer, F. A., and White, K. P. (1988). A recent survey of production scheduling. IEEE Transactions on Systems, Man and Cybernetics, 18 (6), 841-851.
Roth, E. M., and Woods, D. D. (1989). Cognitive task analysis: an approach to knowledge acquisition for intelligent system design. In G. Guida and C. Tasso (Eds.) Topics in Expert System Design: Methodologies and Tools, Amsterdam: Elsevier Science Publishers, pp. 233-264.
Rouse, W. B. (1987). Models of human problem solving: detection, diagnosis, and compensation for system failures. In A. P. Sage (Ed.) System Design for Human Interaction, New York: IEEE Press, pp. 278-290.
Rouse, W. B. and Morris, N. M. (1986). On looking into the black box: Prospects and limits in the search for mental models. Psychological Bulletin, 100, 349-363.


Sadeh, N. (1991). MICRO-BOSS: A Micro-Opportunistic Factory Scheduler, mimeograph, Center for Integrated Manufacturing Decision Systems, The Robotics Institute, Carnegie Mellon University.
Sadeh, N. (1992). Personal communication, Wednesday 29 January 1992.
Sadeh, N., and Fox, M. S. (1990). Variable and Value Ordered Heuristics of Activity-Based Job-Shop Scheduling. Proceedings of the Fourth International Conference on Expert Systems in Production and Operations Research, Hilton Head, South Carolina, May 1990.
Sadowski, R. P., and Harmonosky, C. M. (1987). Production and project scheduling. In John A. White (Ed.) Production Handbook, 4th Ed., New York: John Wiley & Sons, pp. 3-137—3-158.
Sage, A. P. (1981). Behavioral and organizational considerations in the design of information systems and processes for planning and decision support. IEEE Transactions on Systems, Man, and Cybernetics, 11(9), 640-678.
Sage, A. P. (1987). Behavioral and organizational considerations in the design of information systems and processes for planning and decision support. In A. P. Sage (Ed.) System Design for Human Interaction, New York: IEEE Press, pp. 55-93.
Sanderson, P. M. (1988). Human supervisory control in discrete manufacturing: translating the paradigm. In W. Karwowski (Ed.) Ergonomics of Hybrid Automated Systems I: Proceedings of the First International Conference on Ergonomics of Advanced Manufacturing and Hybrid Systems, Louisville, Kentucky, August 15-18, 1988, Elsevier, pp. 15-22.
Sanderson, P. (1989). The human planning and scheduling roles in advanced manufacturing systems: an emerging human factors domain. Human Factors, 31 (6), 635-666.
Sanderson, P. M. (1991). Towards the model human scheduler. International Journal of Human Factors in Manufacturing, 1(3), 195-219.


Sanderson, P. M. (1998). Cognitive work analysis and the analysis, design, and evaluation of human-computer interactive systems. In P. Calder and B. Thomas (Eds.) Proceedings 1998 Australian Computer Human Interaction Conference, OzCHI'98, November 30 – December 4, Adelaide, IEEE, pp. 220-227.
Sanderson, P. M. and Harwood, K. (1988). The skills, rules, and knowledge classification: A discussion of its emergence and nature. In L. P. Goodstein, H. B. Andersen and S. E. Olsen (Eds.) Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis, pp. 21-34.
Sanderson, P. M., and Moray, N. (1990). The human factors of scheduling behavior. In W. Karwowski, M. Rahimi (Eds.) Ergonomics of Hybrid Automated Systems II, Amsterdam: Elsevier, pp. 399-406.
Schank, R. C., and Abelson, R. P. (1977). Scripts, Plans, Goals, and Understanding, Hillsdale, NJ: Lawrence Erlbaum Associates.
Sen, A. and Biswas, G. (1985). Decision support systems: An expert systems approach. Decision Support Systems, 1.
Sharit, J. (1984). Human Supervisory Control of a Flexible Manufacturing System: An Exploratory Investigation, doctoral dissertation, Purdue University.
Sharit, J. (1985). Supervisory control of a flexible manufacturing system. Human Factors, 27(1), 47-59.
Sharit, J. (1988). Issues in modeling supervisory control in flexible manufacturing systems. In W. Karwowski (Ed.) Ergonomics of Hybrid Automated Systems I: Proceedings of the First International Conference on Ergonomics of Advanced Manufacturing and Hybrid Systems, Louisville, Kentucky, August 15-18, 1988, Elsevier, pp. 3-13.


Sharit, J., Eberts, R. and Salvendy, G. (1988). A proposed theoretical framework for design of decision support systems in computer-integrated manufacturing systems: A cognitive engineering approach. International Journal of Production Research, 26, 1037-1063.
Sharit, J., and Salvendy, G. (1987). A real-time interactive computer model of a flexible manufacturing system. IIE Transactions, 19, 167-177.
Sheridan, T. B. (1976). Toward a general model of supervisory control. In T. B. Sheridan and G. Johannsen (Eds.) Monitoring Behavior and Supervisory Control, New York: Plenum, pp. 271-281.
Sheridan, T. B. (1987). Supervisory control. In G. Salvendy (Ed.) Handbook of Human Factors, New York: Wiley, pp. 1243-1268.
Sheridan, T. B. (1988). Human and computer roles in supervisory control and telerobotics: musings about function, language and hierarchy. In L. P. Goodstein, H. B. Andersen and S. E. Olsen (Eds.) Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis, pp. 149-160.
Sheridan, T. B. and Johannsen, G. (Eds.) (1976). Monitoring Behavior and Supervisory Control, New York: Plenum Press.
Sibley, D. (1988). Spatial Applications of Exploratory Data Analysis. Norwich: Geo Books.
Simon, H. A. (1955). A behavioral model of rational choice. Quarterly Journal of Economics, 69, 99-118.
Simon, H. (1960). The New Science of Management Decision, New York: Harper and Row.
Smith, H. T., and Crabtree, R. C. (1975). Interactive planning: a study of computer aiding in the execution of a simulated scheduling task. International Journal of Man-Machine Studies, 7, 213-231.


Smith, S. F. (1985). A constraint-based framework for reactive management of factory schedules. In M. O. Oliff (Ed.) Intelligent Manufacturing: Proceedings from the First Conference on Expert Systems and the Leading Edge in Production Planning and Control, Menlo Park: The Benjamin/Cummings Pub. Co., pp. 113-130.
Smith, S. F., and Fox, M. S. (1985). Constructing and Maintaining Detailed Production Plans: Investigations into the Development of Knowledge-Based Factory Scheduling Systems, Carnegie-Mellon University, Pittsburgh, Pennsylvania.
Smith, S. F., Ow, P. S., Potvin, J.-Y., Muscettola, N., and Matthys, D. C. (1990). An Integrated Framework for Generating and Revising Factory Schedules. Journal of the Operational Research Society, 41(6), 539-552.
Shneiderman, B. (1982). The future of interactive systems and the emergence of direct manipulation. Behavior and Information Technology, 1, 237-256.
So, K. C. (1990). Some heuristics for scheduling jobs on parallel machines with setups. Management Science, 36(4), 467-475.
Solberg, J. J. (1989). Production planning and scheduling in CIM. In G. X. Ritter (Ed.) Information Processing '89, Elsevier Science Publishers B.V., North Holland, pp. 919-925.
Solso, R. L. (1979). Cognitive Psychology, New York: Harcourt Brace Jovanovich Inc.
Sorkin, R. D., and Woods, D. D. (1985). Systems with human monitors: A signal detection analysis. Human-Computer Interaction, 1, 49-75.
Sprague, R. H. and Carlson, E. D. (1982). Building Decision Support Systems, Englewood Cliffs, NJ: Prentice-Hall.
Srikar, B. N., and Ghosh, S. (1986). A MILP model for the n-job, m-stage flowshop with sequence dependent set up times. International Journal of Production Research, 24, 1459-1474.


Stecke, K. E. and Solberg, J. J. (1977). Scheduling of Operations in a Computerized Manufacturing System, dissertation, School of Industrial Engineering, Purdue University.
Sun, D. and Lin, L. (1993). A dynamic job shop scheduling framework: a backward approach. International Journal of Production Research, 32 (4), 967-985.
Sutherland, J. W. (1986). Assessing the artificial intelligence contribution to decision technology. IEEE Transactions on Systems, Man and Cybernetics, SMC-16 (1), 3-20.
Svestka, J. A. (1984). Dynamic Rescheduling Methods, mimeograph, Cleveland State University.
Svestka, J. A. (1988). A real time rescheduler - supplying the missing link. In A. Mital (Ed.) Recent Developments in Production Research: Collection of Refereed Papers Presented at the IXth International Conference on Production Research, Elsevier, pp. 247-253.
Tabe, T., Yamamuro, S., and Salvendy, G. (1988). An approach to knowledge elicitation in scheduling FMS: Toward a hybrid intelligent system. In W. Karwowski, H. R. Parsaei and W. R. Wilhem (Eds.) Ergonomics of Hybrid Automated Systems I, Elsevier Science Publishers, Amsterdam, pp. 259-266.
Tabe, T. and Salvendy, G. (1988). Toward an interactive intelligent system for scheduling and rescheduling of FMS. International Journal of Computer Integrated Manufacturing, 1 (3), 154-164.
Tang, C. S. (1990). Scheduling batches on parallel machines with major and minor set-ups. European Journal of Operational Research, 46, 28-37.
Treisman, A. and Souther, J. (1985). Search asymmetry: a diagnostic for preattentive processing of separable features. Journal of Experimental Psychology: General, 114, 285-310.


Tsubone, H. (1988). An analytical model for hierarchical production planning. In A. Mital (Ed.) Recent Developments in Production Research: Collection of Refereed Papers Presented at the IXth International Conference on Production Research, Elsevier, pp. 216-222.
Tukey, J. W. (1977). Exploratory Data Analysis. Addison-Wesley.
Tufte, E. R. (1983). The Visual Display of Quantitative Information. Cheshire, Connecticut: Graphics Press.
Leung, Y., and Apperley, M. D. (1993). E3: towards the metrication of graphical techniques for large data sets. In L. J. Bass, J. Gornostaev, and C. Unger (Eds.) Selected Papers of the Third International Conference on Human-Computer Interaction (EWHCI'93), Moscow, August 1993, Berlin: Springer-Verlag, pp. 125-140.
Ullman, J. D. (1976). Complexity of sequencing problems. In E. G. Coffman (Ed.) Computer and Job/Shop Scheduling Theory, New York: John Wiley, pp. 139-164.
Umbers, I. G. (1979). Models of the process operator. International Journal of Man-Machine Studies, 11, 263-284.
Vicente, K. J. (1990). A few implications of an ecological approach to human factors. Human Factors Society Bulletin, 33 (11), 1-4.
Vicente, K. (1991). Supporting Knowledge-Based Behavior through Ecological Interface Design, University of Illinois at Urbana-Champaign, EPRL-91-01.
Vicente, K. (1997). Personal communication, 30 April 1997.
Vicente, K. J. (1999). Cognitive Work Analysis: Towards Safe, Productive, and Healthy Computer-based Work, Hillsdale, NJ: Lawrence Erlbaum Associates.


Vicente, K. J., Moray, N., Lee, J. D., Rasmussen, J., Jones, B. G., Brock, R. and Djemil, T. (1996). Evaluation of a Rankine cycle display for nuclear power plant monitoring and diagnosis. Human Factors, 38 (3), 506-521.
Vicente, K. J. and Rasmussen, J. (1990). The ecology of human-machine systems II: mediating “direct perception” in complex work domains. EPRL-90-91, University of Illinois at Urbana-Champaign, USA.
Vicente, K. J. and Rasmussen, J. (1992). Ecological interface design: theoretical foundations. IEEE Transactions on Systems, Man and Cybernetics, SMC-22, 589-606.
Volpert, W. (1982). The model of the hierarchical-sequential organization of action. In W. Hacker, W. Volpert and M. von Cranach (Eds.) Cognitive and Motivational Aspects of Action, Amsterdam: North Holland, pp. 35-51.
Weber, M. (1962). Basic Concepts in Sociology, trans. with intro. H. P. Secher, New Jersey: The Citadel Press.
Whitefield, A. (1985). Constructing and applying a model of the user for computer system development: the case for Computer-Aided-Design, PhD thesis, University College, London.
Wickens, C. D. (1987). Attention. In P. Hancock (Ed.) Human Factors in Psychology, Amsterdam: Elsevier (North-Holland), pp. 29-80.
Wiers, V. C. S. (1996). A quantitative field study of the decision behaviour of four shop floor schedulers. Production Planning and Control, 7 (4), 383-392.
Wiers, V. C. S. (1997). Human computer interaction in production scheduling: Analysis and design of decision support systems for production scheduling tasks, PhD Thesis, Eindhoven University of Technology, The Netherlands.


Wiers, V. C. S. and McKay, K. N. (1996). Task allocation: human computer interaction in intelligent scheduling. Proceedings of the 15th Workshop of the UK Planning & Scheduling Special Interest Group, November 1996, Liverpool (UK), pp. 333-344.
Williams, D. N. (1993). Machine Scheduling Problems with Setup Times, unpublished MEngSc thesis, The University of Melbourne.
Wilson, J. R., and Rutherford, A. (1989). Mental models: Theory and application in human factors. Human Factors, 31, 617-634.
Wittrock, R. J. (1990). Scheduling parallel machines with major and minor setup times. The International Journal of Flexible Manufacturing Systems, 2, 329-341.
Wood, D. J., Shotter, J. D., and Godden, D. (1974). An investigation of the relationship between problem-solving strategies, representation and memory. Quarterly Journal of Experimental Psychology, 26, 252-257.
Woods, D. D. (1986). Paradigms for intelligent decision support. In E. Hollnagel, G. Mancini and D. D. Woods (Eds.) Intelligent Decision Support in Process Environments, Springer-Verlag, pp. 153-173.
Woods, D. D. (1988). Coping with complexity: the psychology of human behaviour in complex systems. In L. P. Goodstein, H. B. Andersen and S. E. Olsen (Eds.) Tasks, Errors, and Mental Models: a festschrift to celebrate the 60th birthday of Professor Jens Rasmussen, London: Taylor & Francis, pp. 128-148.
Woods, D. D. (1991). The cognitive engineering of problem representations. In J. Alty and G. Weir (Eds.) Human-Computer Interaction in Complex Systems, London: Academic, pp. 169-188.


Woods, D. D. (1995). Toward a theoretical base for representation design in the computer medium: ecological perception and aiding human cognition. In J. M. Flach, P. Hancock, J. Caird, and K. J. Vicente (Eds.) Global Perspectives on the Ecology of Human-Machine Systems, Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 157-188.
Woods, D. D., and Roth, E. M. (1988). Cognitive systems engineering. In M. Helander (Ed.) Handbook of Human-Computer Interaction, Amsterdam: Elsevier Science Publishers, pp. 3-43.
Wulf, V., and Rohde, M. (1998). Towards an integrated organization and technology development. [Online] Available http://www.iig.uni-freiburg.de/modell/tele_ueb/acm-dis.html [Accessed 10 July 1998].
Yang, K. K., and Sum, C. C. (1993). A comparison of job shop dispatching rules using a total cost criterion. International Journal of Production Research, 32 (4), 807-820.
Young, R. M. (1983). Surrogates and mappings: two kinds of conceptual models for interactive devices. In D. Gentner and A. L. Stevens (Eds.) Mental Models, Hillsdale, NJ: Lawrence Erlbaum Associates, pp. 35-52.
Zuboff, S. (1988). In the Age of the Smart Machine: The Future of Work and Power, New York: Basic Books.
