
Int. J. Production Economics 64 (2000) 231–241

Quantitative models for performance measurement system

P. Suwignjo*, U.S. Bititci, A.S. Carrie

Centre for Strategic Manufacturing, University of Strathclyde, 75 Montrose Street, Glasgow G1 1XJ, UK

Abstract

This paper describes work at the Centre for Strategic Manufacturing, University of Strathclyde, on developing Quantitative Models for Performance Measurement Systems (QMPMS) using cognitive maps, cause-and-effect diagrams, tree diagrams, and the analytic hierarchy process. It describes how the technique can be used to identify factors affecting performance and their relationships, structure them hierarchically, quantify the effect of the factors on performance, and express them quantitatively. A simple example is used throughout the paper to explain how the concept of the model works. An application of the method to model 'total production cost per unit' at a collaborating company is presented. Then a taxonomy of performance measurement is outlined. The taxonomy, which is developed based on the QMPMS, can be used to prioritise performance measurement. © 2000 Elsevier Science B.V. All rights reserved.

Keywords: AHP; Performance measurement; Prioritisation of performance measurement

1. Introduction

The changing nature of competition in world markets during the last two decades has had a great impact on the environment, both external and internal, within which most companies operate. Customers are becoming more critical about quality and customer service. Quality, speed, and flexibility, in addition to cost, have emerged as the three most important competitive attributes [1–4]. Companies failing to respond to these changes have suffered by losing market share. In order to survive in this new environment and to regain competitive edge, companies have adopted new philosophies and technologies such as Business Process Re-engineering (BPR), Total Quality Management (TQM), Just In Time (JIT), Computer Integrated Manufacturing (CIM), and Flexible Manufacturing Systems (FMS) [5]. These internal changes have shifted the management focus to strategies that include quality, flexibility, shorter lead-times and delivery reliability, as well as cost. Many researchers have shown that traditional financially based performance measurement systems have failed to measure and integrate all the factors critical to the success of a business [6–11]. To deal with the new environment, new performance measurement systems have been proposed, such as the Activity Based Costing System [12–15], the Balanced Scorecard [16], the SMART System [17], and the Performance Measurement Questionnaire [18]. Some researchers, rather than providing general frameworks for performance measurement system design, preferred proposing criteria for the design of performance measurement systems [11,19]. Extensive, continuing research carried out

* Corresponding author. Tel.: +44-141-548-2015; fax: +44-141-552-0557. E-mail address: [email protected] (P. Suwignjo).



at the Centre for Strategic Manufacturing (CSM), University of Strathclyde, has developed a Reference Model and Audit Method to assess the robustness and integrity of the performance measurement systems in use [20]. Despite the availability of these various approaches to developing performance measurement systems, none of them has attempted to quantify the effects of the various factors on performance, except recent research by Rangone [21], which proposed the use of the Analytic Hierarchy Process to compare the performance of factories in supporting manufacturing strategy. Moreover, audits conducted by the researchers at the CSM based on the Integrated Performance Measurement System (IPMS) Reference Model demonstrate that most companies use both financial and non-financial performance measures. However, they do not attempt to structure these measures in a logical manner to understand and manage the relationships between measures. Maskell [11] proposed seven common characteristics for the design of performance measurement systems, three of which are relevant to the present discussion. These are:

- performance measures should relate directly to manufacturing strategy,
- performance measures should vary between companies,
- performance measures should change over time.

Implementing a performance measurement system compatible with these criteria usually produces unmanageable performance reports. As reported by Neely et al. [22], one of the issues on the future research agenda for performance measurement is developing a technique to reduce the number of measures to a manageable set. In response to the issues identified in the above paragraphs, this paper presents a model for:

- identifying factors affecting performance and their relationships,
- structuring them hierarchically,
- quantifying the effect of the factors on performance.

A simple example is used to show how the model works. Then, the application of the model to a collaborating company is discussed. In the second part of the paper a model for prioritising performance measurement is outlined.

2. Quantitative model for performance measurement system

Heim and Compton [23] quoted Lord Kelvin's proposition as follows:

When you can measure what you are speaking about and express it in numbers you know something about it …, (otherwise) your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in thought advanced to the stage of science.

Performance measurement systems usually involve a number of multidimensional performance measures [24]. Rangone [21] pointed out that a problem arising from this situation is the integration of these several measures, expressed in heterogeneous units, into a single unit. The Quantitative Model for Performance Measurement Systems (QMPMS) developed in this research uses the Analytic Hierarchy Process to quantify the effects of factors on performance [25]. There are three main steps in the QMPMS:

1. identification of factors affecting performance and their relationships,
2. structuring the factors hierarchically,
3. quantifying the effect of the factors on performance.

2.1. Identification of factors affecting performance and their relationships

Factor identification is the most crucial step in QMPMS implementation. Failure to include all the factors affecting performance, and to identify their relationships, will certainly degrade the results. In order to explore and identify factors affecting performance and their relationships, the QMPMS uses cognitive maps. Performance measurement usually involves a number of factors and persons from different


departments. The factors to be measured are often not well defined from the start, and uncovering the real factors to be measured and their relationships is not always an easy task. Different individuals will interpret a particular situation differently depending on their mental framework and political concerns [26]. The cognitive map seems to be an effective tool in helping to identify the factors affecting performance and their relationships. Eden et al. [26] define cognitive mapping as "a modelling technique which intends to portray ideas, beliefs, values and attitudes and their relationship one to another in a form which is amenable to study and analysis". To give a simple example, suppose a person wanted to move to another country. He might choose the country where he could maximise his wealth in terms of the amount of money in his bank account. He could use a cognitive map to identify factors affecting the amount of his money in the bank. Fig. 1(a) shows the cognitive map produced. An arrow indicates the effect of a factor on performance or on other factors. In general, the effect of a factor on performance, as indicated by Fig. 1, can be classified into:

- direct (vertical) effect,
- indirect (horizontal) effect,
- self-interaction effect.


2.1.1. Direct effect

The direct effect of a factor on performance is an aggregate of all the effects exerted on performance through that factor: it is made up of the factor's inherent effect, its self-interaction effect, and the indirect effects of other factors acting through it. As indicated by Fig. 1, there are three factors that directly affect 'amount of money in the bank': 'initial deposit', 'interest', and 'savings paid in'. Positive signs at the ends of the arrows indicate positive effects of the factors.

2.1.2. Indirect effect

In addition to their direct (vertical) effects, the factors 'initial deposit' and 'savings paid in' in Fig. 1 also indirectly affect 'amount of money in the bank' through their effects on other factors within the same level (e.g. 'interest'). The indirect effect is the effect of a factor on performance through other factors within the same level.

2.1.3. Self-interaction effect

'Interest' indirectly affects 'amount of money in the bank' through its self-interaction. The self-interaction effect is the effect of a factor on itself. The factor 'interest' in Fig. 1 has a self-interaction effect, since the amount of money received from interest this month will increase the amount of interest received next month.
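To make the three effect types concrete, the classification can be sketched as a simple rule over the arrows of the cognitive map. This is an illustrative reconstruction: the edge list below is inferred from the textual description of Fig. 1(a), not taken from the figure itself.

```python
# Arrows of the 'money in the bank' cognitive map, reconstructed from the
# text of Section 2.1; all signs in this example are positive.
PERFORMANCE = "amount of money in the bank"

edges = [
    ("initial deposit", PERFORMANCE),   # direct (vertical) effect
    ("interest", PERFORMANCE),          # direct (vertical) effect
    ("savings paid in", PERFORMANCE),   # direct (vertical) effect
    ("initial deposit", "interest"),    # indirect (horizontal) effect
    ("savings paid in", "interest"),    # indirect (horizontal) effect
    ("interest", "interest"),           # self-interaction effect
]

def effect_type(source, target, performance=PERFORMANCE):
    """Classify one arrow of the cognitive map."""
    if source == target:
        return "self-interaction"
    if target == performance:
        return "direct"
    return "indirect"

for s, t in edges:
    print(f"{s} -> {t}: {effect_type(s, t)}")
```

The rule mirrors the definitions above: an arrow into the performance measure is direct, an arrow into a peer factor is indirect, and a loop is self-interaction.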

Fig. 1. (a) Cognitive map; (b) cause-and-effect diagram; and (c) tree diagram of 'amount of money in the bank'.
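The level-assignment rule that Section 2.2 applies to this tree diagram can be sketched as a small graph routine. This is a sketch under the assumption that only the direct (vertical) effects are used to build the hierarchy; the function name and edge representation are illustrative.

```python
def hierarchy_levels(vertical_edges):
    """Assign hierarchy levels from direct-effect edges (factor, affected_factor).

    Level 0 holds factors that are affected by others but affect none;
    factors directly affecting a level-n factor sit at level n + 1.
    The vertical edges are assumed acyclic, as in a tree diagram.
    """
    sources = {s for s, _ in vertical_edges}
    targets = {t for _, t in vertical_edges}
    levels = {node: 0 for node in targets - sources}  # level 0: pure sinks
    frontier, depth = set(levels), 0
    while frontier:
        depth += 1
        frontier = {s for s, t in vertical_edges
                    if t in frontier and s not in levels}
        for node in frontier:
            levels[node] = depth
    return levels

# Direct effects from the 'money in the bank' example
edges = [("initial deposit", "amount of money in the bank"),
         ("interest", "amount of money in the bank"),
         ("savings paid in", "amount of money in the bank")]
print(hierarchy_levels(edges))
```

Applied to the example, 'amount of money in the bank' lands at level 0 and the three contributing factors at level 1, matching the tree diagram of Fig. 1(c).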


2.2. Structuring the factors hierarchically

In the first step, the main concern is merely to elaborate the factors affecting performance and their relationships. No attempt is made to classify factors at the same level into one group; as a result, the hierarchical structure of the factors is not clear. Cause-and-effect diagrams can be used to identify the hierarchical structure of the factors. A factor is a member of level 0 if it is affected by other factors but does not itself affect any other factor. Factors which directly affect the factors at a particular level are members of the next lower level. A tree diagram can then be used to give a clearer picture of the hierarchical structure. For the money-in-the-bank problem, the cause-and-effect diagram and the tree diagram can be seen in Figs. 1(b) and (c). In the tree diagram, the number above each factor indicates the level and the factor's number within that level. For example, the number 1.2 above 'interest' indicates that interest is factor number 2 in level 1. Numbers below the factors indicate interaction relationships between the factors. For example, the numbers 1.1, 1.2, 1.3 below 'interest' indicate that besides affecting 'amount of money in the bank' directly (vertically), 'interest' also indirectly affects 'amount of money in the bank' through its interactions with the other factors.

2.3. Quantifying the effect of the factors on performance

Finally, the relative effects of the factors (direct, indirect, and combined) can be quantified using standard Analytic Hierarchy Process (AHP) procedures [27–29]. The quantification is carried out on the basis of pair-wise comparisons among the factors. For each pair of factors from a particular level, their effect on the factor at the next higher level (direct effect) or on a factor within the same cluster (indirect effect) is compared. A score lying between one (equally important) and nine (absolutely more important) is assigned for each comparison, depending on the subjective judgement of the analyst. The result is a pair-wise comparison matrix. The relative effects

of the factors on performance can be generated by normalising the eigenvector associated with the maximum eigenvalue of the pair-wise comparison matrix [27]. The pair-wise comparison questionnaire and the pair-wise comparison matrix of the 'amount of money in the bank' problem described earlier are shown in Fig. 2. The question posed to the person would be: 'Comparing factor A to B, which one has the stronger effect on amount of money in the bank?', and 'How strong?'. Suppose that, in answering this type of question, the person believes that factor 'deposit' has a stronger effect on 'amount of money in the bank' than factor 'interest', and the score in the pair-wise comparison matrix is 5. Based on the results of the pair-wise comparison questionnaire in Fig. 2(a), the pair-wise comparison matrix of 'amount of money in the bank' can be constructed as indicated in Fig. 2(b). Using the ExpertChoice [30] software, the relative effects of the factors 'initial deposit', 'interest', and 'savings paid in' on 'amount of money in the bank' are 0.212, 0.062, and 0.726, respectively. The effect of factor 'savings paid in' on 'amount of money in the bank' is roughly 3.4 times stronger than the effect of 'initial deposit', while the effect of 'initial deposit' is in turn roughly 3.4 times stronger than the effect of 'interest'. The combined effects of the factors on performance can be computed by decomposing the direct effects of the factors into two clusters. For example, in Fig. 3 the direct effect of factor A on performance can be decomposed into cluster I and cluster II. Cluster I consists of the inherent effect of factor A. Cluster II consists of the self-interaction effect of factor A and the horizontal effects of other factors through factor A. The interaction effect of factor A on performance is made up of the self-interaction effect of factor A and the horizontal effects of factor A through other factors within the same level. Finally, the combined effect of factor A on performance is made up of the inherent effect of factor A and the interaction effect of factor A. The next section of the paper presents a practical application of the QMPMS in a collaborating company.
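The eigenvector computation described above can be sketched with power iteration. The paper's actual matrix appears in Fig. 2(b) and is not reproduced in the text, so the entries below (beyond the deposit-versus-interest score of 5 quoted above) are assumed for illustration.

```python
import numpy as np

# Pair-wise comparison matrix on Saaty's 1-9 scale. Rows/columns:
# initial deposit, interest, savings paid in. Entries other than the
# 'deposit vs interest = 5' score are assumed for this sketch; entries
# below the diagonal are reciprocals of those above it.
factors = ["initial deposit", "interest", "savings paid in"]
A = np.array([
    [1.0, 5.0, 1 / 5],
    [1 / 5, 1.0, 1 / 9],
    [5.0, 9.0, 1.0],
])

# Power iteration converges to the principal eigenvector; its normalised
# entries are the relative effects of the factors on performance.
w = np.ones(len(A))
for _ in range(100):
    w = A @ w
    w = w / w.sum()

for name, weight in zip(factors, w):
    print(f"{name}: {weight:.3f}")
```

ExpertChoice performs this computation (together with consistency checking) internally; the sketch only exposes the underlying linear algebra.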


Fig. 2. (a) Pairwise comparison questionnaire; (b) pairwise comparison matrix.

Fig. 3. The interaction model of QMPMS.

3. The practical application of the QMPMS

The QMPMS has been applied in various collaborating companies. In this section an application of the QMPMS to model 'total production cost per unit' at 'S (UK) Ltd.' is presented. The case shows how the QMPMS was used to identify factors affecting performance and their relationships, and to quantify the effects of the factors on 'total production cost per unit'. 'S (UK) Ltd.' is part of the 'S' Corporation, which specialises in the manufacture and distribution of disk and tape drives for the electronics industry. 'S (UK) Ltd.' configures the Company's products


to customer specification before delivery to the customer. First, the research team gave a presentation to explain the concept of the QMPMS model to the management team of the Company. At this step it is critical to make clear to the management the concept of the pairwise comparison questionnaires used by the model, which ask: 'Comparing factor A to B, which one has the stronger effect on performance?' and 'How strong?'. At the end of the presentation the research team asked the management team to start thinking about a problem that might be selected as the case study. The research team and the management team then discussed the selection of a case study. They agreed to select 'total production cost per unit' because of the availability of reliable data. The research team then facilitated the management team, using cognitive mapping methods, in identifying the factors affecting 'total production cost per unit' and their relationships. It is important to note that in this step the research team acted as a facilitator, assisting the management team to develop their own cognitive model. The result is indicated in Fig. 4. The meeting then moved on to eliciting the managers' judgements on the effects of the factors on 'total production cost per unit'. The commercially available AHP software ExpertChoice [30] was used for this purpose. The direct effects of the factors on 'total production cost per unit' are summarised in Table 1.

Furthermore, the managers believe that in their Company 5% of the direct effect of 'material and supplies cost per unit' on 'variable cost per unit' is contributed by the factor 'volume'. They also believe that 10% of the direct effect of 'people related cost per unit' on 'variable cost per unit' is contributed by the factor 'volume'. Consequently, the combined effect of the factor 'volume' on 'total production cost per unit' is greater than that listed in Table 1, and the combined effects of the factors 'material and supplies cost per unit' and 'people related cost per unit' are lower than those listed in Table 1. Fig. 5 shows the computation of the combined effect of the factor 'volume' using the framework indicated in Fig. 3. As indicated by the cognitive map in Fig. 4, no other factor has an indirect effect through the factor 'volume'; consequently, all of the direct effect of the factor 'volume' (0.205) is contributed by its inherent effect. The factor 'volume' also has no self-interaction effect. However, 'volume' has an indirect effect on 'variable cost per unit' through the factors 'material and supplies cost per unit' (0.05 × 0.072) and 'people related cost per unit' (0.10 × 0.365). The same procedure was used to compute the combined effects of the other factors. The results are summarised in Table 2.
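The combined-effect computation of Fig. 5 is a short piece of arithmetic on the figures quoted above; a minimal sketch:

```python
# Direct effects taken from Table 1
direct = {
    "volume": 0.205,
    "material and supplies cost per unit": 0.072,
    "people related cost per unit": 0.365,
}

# Shares of the donors' direct effects contributed by 'volume',
# as stated by the managers (5% and 10%)
contributed_by_volume = {
    "material and supplies cost per unit": 0.05,
    "people related cost per unit": 0.10,
}

# Combined effect = inherent effect + indirect effects through other factors
indirect = sum(share * direct[factor]
               for factor, share in contributed_by_volume.items())
combined_volume = direct["volume"] + indirect
print(round(combined_volume, 3))  # 0.205 + 0.05*0.072 + 0.10*0.365 = 0.245

# The donors' combined effects shrink by the share they cede to 'volume'
combined_material = direct["material and supplies cost per unit"] * (1 - 0.05)
combined_people = direct["people related cost per unit"] * (1 - 0.10)
```

After rounding to three decimals these values match the combined effects reported in Table 2 (0.245, 0.068, 0.329).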

4. Prioritisation of performance measures

The output of the QMPMS is the quantification of the relative effects of several factors on

Fig. 4. Factors affecting 'total production cost per unit' and their relationships.


Fig. 5. The computation of the combined effect of the factor 'volume'.

Table 1
Direct effects of factors on 'total production cost per unit'

Factor                                  Direct effect
Fixed cost per unit                     0.500
Variable cost per unit                  0.500
Fixed cost                              0.137
Volume                                  0.205
Material and supplies cost per unit     0.072
People related cost per unit            0.365
% Labour utilisation                    0.006
Overtime                                0.359

Table 2
Combined effects of factors affecting 'total production cost per unit'

Factor                                  Direct effect    Combined effect
Fixed cost per unit                     0.500            0.500
Variable cost per unit                  0.500            0.500
Fixed cost                              0.137            0.137
Volume                                  0.205            0.245
Material and supplies cost per unit     0.072            0.068
People related cost per unit            0.365            0.329
Utilisation                             0.006            0.006
Overtime                                0.359            0.323
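The Pareto/ABC grouping used in this section to prioritise measures can be sketched over effects such as these. Normalising the leaf-factor combined effects of Table 2 so they sum to one, and the exact treatment of the 75%/5% cut-offs, are assumptions of this sketch.

```python
# Combined effects of the leaf factors from Table 2
effects = {
    "Fixed cost": 0.137,
    "Volume": 0.245,
    "Material and supplies cost per unit": 0.068,
    "People related cost per unit": 0.329,
    "Utilisation": 0.006,
    "Overtime": 0.323,
}

def abc_classify(effects, a_share=0.75, c_share=0.05):
    """Group factors into Pareto classes A, B and C by cumulative effect."""
    total = sum(effects.values())
    ranked = sorted(effects.items(), key=lambda kv: kv[1], reverse=True)
    classes, cum = {}, 0.0
    for name, weight in ranked:
        if cum < a_share:
            classes[name] = "A"       # largest contributors, up to ~75%
        elif cum >= 1.0 - c_share:
            classes[name] = "C"       # tail contributing the last ~5%
        else:
            classes[name] = "B"
        cum += weight / total
    return classes

print(abc_classify(effects))
```

On these figures, 'people related cost per unit', 'overtime' and 'volume' fall into class A, 'fixed cost' and 'material and supplies cost per unit' into class B, and 'utilisation' into class C.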

performance. In order to reduce the number of performance measures, Pareto analysis can be used to group these factors into classes A, B and C based on their effects on performance. Factors that have the greatest impact on performance and together contribute over 75% of the performance may be classified as A. Factors that have the lowest effect on performance and together contribute only around 5% of the performance may be classified as C. The rest of the factors may be classified as B. However, this ABC prioritisation approach is not sufficiently rigorous to minimise the number of performance measures, because the rate of change of factors varies dramatically: some factors may change day-to-day, others month-to-month or year-to-year. Therefore, in addition to the ABC classification it is important to establish the rate of change of each factor. The general principle is that factors having the greatest impact and changing most rapidly should be monitored most frequently, while those of less importance, or that are more stable, can be monitored less often. Based on the principles of ABC classification and rate-of-change classification, a taxonomy of performance measures has been developed, as indicated in Fig. 6. This taxonomy can be used as a general guideline in classifying the performance measures


into the categories 'critical', 'intermediate', and 'minor'. As indicated in Fig. 6, the 'intermediate' and 'minor' categories are classified further into 'intermediate I', 'intermediate II', 'minor I', and 'minor II'. This classification may be used to determine the frequency of performance measurement: critical factors are monitored most frequently, intermediate factors less frequently, and the remaining factors may be monitored only seldom. The time-frame used in this taxonomy will vary between companies. Companies producing capital goods will use a longer time-frame than companies producing fast-moving consumer goods.

Fig. 6. The taxonomy and monitoring-frequency determination of performance measures.

5. Discussion and conclusion

5.1. Achievements and benefits

An approach for quantifying the relationships between the various factors affecting performance has been developed and demonstrated. The benefits of the QMPMS approach may be summarised as follows:

- Factors affecting performance can be identified and their effects quantified.
- Effects of multidimensional factors on performance can be aggregated into a single dimensionless unit (priority).
- It helps managers quantify the level of impact of each factor on overall performance and therefore assists in focusing improvement activities.
- The relationships between factors can be clearly identified and expressed in quantitative terms.
- All the above points combined assist in understanding the dynamic behaviour of the factors affecting performance.
- It facilitates the reduction of the number of performance measurement reports.

5.2. Subjectivity vs. objectivity of approach

People may feel that the technique used in the QMPMS model (AHP) for quantifying the effects of factors on performance is very intuitive, subjective, and very difficult to use in practice. However, through careful explanation of the concept of the approach, the authors have found that people can understand and implement it with little difficulty. In a performance measurement system a large number of multidimensional factors can affect performance. Integrating those multidimensional effects into a single unit can only be done through subjective individual or group judgement. It is impossible to have an objective measurement and scaling system for every dimension of measurement that could facilitate objective value trade-offs between different measures. For example, how can we develop an objective measurement and scale for management commitment? How can we objectively quantify the value trade-off between management commitment and percentage of rejects? Such measurements simply do not exist. In fact, subjective measurement is the only concept widely accepted in Multi-Criteria Decision Analysis for dealing with multi-criteria problems. Since the QMPMS uses subjective measurement, the results may not be very accurate. However, this problem can be alleviated by using group judgement rather than individual judgement, which reduces the subjectivity of the judgement. The accuracy of the QMPMS can also be improved through experience: Saaty suggests that two good practical experiences of using AHP usually enable people to provide accurate judgements.

5.3. The research paradigm

Having accepted the above argument that subjective measurement is the only method available for dealing with this type of multi-criteria problem,


some may argue that the adoption of the AHP assumes that the problem lies in a deterministic paradigm rather than a stochastic or even fuzzy paradigm. In tackling this problem the researchers considered the use of techniques which may be better suited to stochastic and fuzzy paradigms, such as Utility Theory and Fuzzy Theory. However, at the outset of the research the objective was to develop an approach simple enough to be easily understood and applied by managers without academic expertise. Therefore the researchers selected the simplest tool which would provide accurate models in the context of a performance measurement system. Experiments [31,32] compared AHP with five other approaches, including the utility approach and the multiple regression approach; the results showed that AHP is the least difficult and most trustworthy.

5.4. Practical issues

Some potential problems might be encountered in applying the QMPMS method. The first relates to managers' hesitation in filling in the pairwise comparison questionnaires, particularly if the model is applied to model performance improvement. Performance improvement usually involves the identification and quantification of a large number of factors affecting performance; consequently, the number of pairwise comparison questionnaires will be enormous, and filling in all of them will be exhausting and time consuming. However, this problem can be minimised through three approaches.

- First, the users, i.e. the management team, must be involved in the whole process. In the case study presented in this paper, the researchers facilitated the management team in building a model of their performance measurement system using the cognitive mapping technique. This in turn heightened the team's awareness of the interactions between the various factors affecting performance in their Company. This increased awareness was a great help when the team was completing the pairwise comparison questionnaires, as they understood the context of the questionnaire.


- Secondly, the model can be decomposed into several smaller models, which are then distributed to groups of people so that each group completes only a sub-set of the questionnaire.
- Thirdly, the use of interactive software, such as ExpertChoice, makes the implementation of the model much easier. In fact, the QMPMS model is now being implemented at a tax office to prioritise 130 performance measures.

The second problem of QMPMS application relates to obtaining a single judgement in the pairwise comparisons when more than one person is involved in filling in the questionnaires. Even though Saaty [28] provides a formula for computing the priority of a group judgement using the geometric mean, reaching consensus among the members of the group on the scores in the pairwise comparison matrices seems better than simply using the geometric mean. Several discussions may be required to elaborate the real situation before a general consensus on the judgement of a particular problem can be achieved. Cognitive mapping is also an effective tool for elaborating the problem. Finally, people who use AHP must be aware of the 'rank reversal' phenomenon for which the AHP is criticised. The rank reversal phenomenon is the change in the rank of the factors when new factors are added to or removed from the existing model [32–38]. Saaty suggests using AHP with feedback to avoid this phenomenon [28].

5.5. The dynamics of performance measurement systems

Companies operate in a dynamic environment; consequently, performance measurement is a dynamic process. This means that performance measures will change over time and vary between companies. At a particular company, performance measures that are critical today may, after a period of time, become less important. The changes could be the result of internal performance improvement programmes, or of changes in the external environment of the company. The dynamic nature of the internal and external environment implies that the performance


measurement system is dynamic and changes as the internal and/or external environment changes. Therefore, the quantitative model which emerges from the QMPMS approach has a life cycle and is only valid as long as the internal and external environment remains stable. Any significant change in the external or internal environment of the company may invalidate the model. It is thus important to recognise these changes as soon as possible so that the quantitative basis of the model can be redefined to reflect the true picture. Future research will target the dynamic nature of the performance measurement system by building in capabilities to make it self-auditing and self-adjusting.

References

[1] D.A. Garvin, Competing on the eight dimensions of quality, Harvard Business Review (November–December) (1987) 101–109.
[2] G. Stalk, Time – the next source of competitive advantage, Harvard Business Review (July–August) (1988) 41–51.
[3] D. Gerwin, An agenda of research on flexibility of manufacturing processes, International Journal of Operations & Production Management 7 (1) (1987) 38–49.
[4] N. Slack, The flexibility of manufacturing systems, International Journal of Operations & Production Management 7 (4) (1987) 35–45.
[5] A.M. Ghalayini, The changing basis of performance measurement, International Journal of Operations & Production Management 16 (8) (1996) 63–80.
[6] R.S. Kaplan, Measuring manufacturing performance: A new challenge for managerial accounting research, The Accounting Review 58 (4) (1983) 686–705.
[7] R.H. Hayes, S.C. Wheelwright, K.B. Clark, Dynamic Manufacturing: Creating the Learning Organization, Free Press, New York, 1988.
[8] R.S. Kaplan, Yesterday's accounting undermines production, Harvard Business Review 62 (July–August) (1984) 95–101.
[9] R.G. Eccles, The performance measurement manifesto, Harvard Business Review 69 (January–February) (1991) 131–137.
[10] J. Fisher, Use of non-financial performance measures, Journal of Cost Management 6 (1992) 31–38.
[11] B.H. Maskell, Performance Measurement for World Class Manufacturing: A Model for American Companies, Productivity Press, Cambridge, 1992.
[12] R. Cooper, The rise of activity-based cost systems: Part I – What is an activity-based cost system? Journal of Cost Management (Summer) (1988) 45–54.

[13] R. Cooper, The rise of activity-based cost systems: Part II – When do I need an activity-based cost system? Journal of Cost Management (Fall) (1988) 41–48.
[14] R. Cooper, The rise of activity-based cost systems: Part III – How many cost drivers do you need and how do you select them? Journal of Cost Management (Winter) (1988) 34–46.
[15] R. Cooper, The rise of activity-based cost systems: Part IV – What do activity-based cost systems look like? Journal of Cost Management (Spring) (1989) 34–46.
[16] R.S. Kaplan, D.P. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, Boston, 1996.
[17] K.F. Cross, R.L. Lynch, The SMART way to define and sustain success, National Productivity Review 8 (1) (1988–1989) 23–33.
[18] J.R. Dixon, A.J. Nanni, T.E. Vollmann, The New Performance Challenge: Measuring Operations for World Class Competition, Dow Jones-Irwin, Homewood, IL, 1990.
[19] S. Globerson, Issues in developing a performance criteria system for an organisation, International Journal of Production Research 23 (4) (1985) 639–646.
[20] U.S. Bititci, A.S. Carrie, L. McDevitt, T. Turner, Integrated performance measurement systems: A reference model, Proceedings of the IFIP WG 5.7 1997 Working Conference, Ascona, Ticino, Switzerland, 1997, pp. 15–18.
[21] A. Rangone, An analytical hierarchy process framework for comparing the overall performance of manufacturing departments, International Journal of Operations & Production Management 16 (8) (1996) 104–119.
[22] A. Neely, M. Gregory, K. Platts, Performance measurement system design: A literature review and research agenda, International Journal of Operations & Production Management 15 (4) (1995) 80–116.
[23] J.A. Heim, W.D. Compton (Eds.), Manufacturing Systems: Foundations of World-Class Practice, National Academy of Engineering, Washington, DC, 1992.
[24] A.D. Neely, J.R. Wilson, Measuring product goal congruence: An exploratory study, International Journal of Operations & Production Management 12 (4) (1992) 45–52.
[25] P. Suwignjo, U.S. Bititci, A.S. Carrie, Quantitative model for performance measurement system: Analytic hierarchy process approach, Proceedings of the MESELA'97 Conference, Loughborough, 1997, pp. 237–243.
[26] C. Eden, S. Jones, D. Sims, Messing About in Problems: An Informal Structured Approach to their Identification and Management, Pergamon Press, New York, 1983.
[27] T.L. Saaty, The Analytic Hierarchy Process, McGraw-Hill, New York, 1980.
[28] T.L. Saaty, Fundamentals of Decision Making and Priority Theory with the Analytic Hierarchy Process, RWS Publications, Pittsburgh, PA, 1994.
[29] T.L. Saaty, Decision Making with Dependence and Feedback – The Analytic Network Process, RWS Publications, Pittsburgh, PA, 1996.
[30] ExpertChoice, EcPro for Windows Decision Support Software User Manual, ExpertChoice Inc., Pittsburgh, PA, 1995.

[31] P.J.H. Schoemaker, C.C. Waid, An experimental comparison of different approaches to determining weights in additive utility models, Management Science 28 (2) (1982) 182–195.
[32] E.G. Zapatero, C.H. Smith, H.R. Weistroffer, Evaluating multi-attribute decision support systems, Journal of Multi-Criteria Decision Analysis 6 (1997) 201–214.
[33] V. Belton, T. Gear, On a short-coming of Saaty's method of analytic hierarchies, Omega 11 (3) (1982) 226–230.
[34] T.L. Saaty, L.G. Vargas, The legitimacy of rank reversal, Omega 12 (5) (1984) 513–516.


[35] J.S. Dyer, Remarks on the analytic hierarchy process, Management Science 36 (3) (1990) 249–258.
[36] P.T. Harker, L.G. Vargas, Reply to 'Remarks on the analytic hierarchy process' by J.S. Dyer, Management Science 36 (3) (1990) 269–273.
[37] S. Schenkerman, Avoiding rank reversal in AHP decision support models, European Journal of Operational Research 74 (1994) 407–419.
[38] L.G. Vargas, Reply to Schenkerman's 'Avoiding rank reversal in AHP decision support models', European Journal of Operational Research 74 (1994) 420–425.
