
IJOPM 21,1/2

A computational geometry approach for benchmarking

Srinivas Talluri


Fairleigh Dickinson University, Teaneck, New Jersey, USA, and

Joseph Sarkis

Clark University, Worcester, Massachusetts, USA

Keywords Benchmarking, Modelling, Performance measurement, Mathematical programming

Abstract Benchmarking involves the identification of best practices for improvement. It assists firms in reengineering their processes in order to achieve higher productivity. Although several methods have been proposed for benchmarking purposes, research in the development of analytical benchmarking tools is limited. This paper proposes an effective benchmarking technique based on computational geometry models. We develop and illustrate benchmarking for two and three-dimensional cases, which involve up to three performance measures, and subsequently address the procedure for the n-dimensional case. The results identify the improvements necessary across various dimensions of business processes in order to achieve higher efficiency. We also develop mathematical programming models that optimally allocate resources in order to attain these improvements.

International Journal of Operations & Production Management, Vol. 21 No. 1/2, 2001, pp. 210-222. © MCB University Press, 0144-3577

Introduction
The concept of benchmarking has proven very effective for organizations that seek to continuously improve their operations and have introduced the total quality management (TQM) philosophy into their business practices. Benchmarking has also been used effectively to "reengineer" organizations. These business improvement programs rely on effective development, integration, and analysis of measurements. The old axiom here is "you can't manage what you can't measure." But once the measurements are gathered, the tools used to evaluate them should provide some benefit to managers. Some analysis tools do exist for this evaluation, but additional tools can provide further power and insight for managing the organization and aiding decision makers.

The benchmarking literature has focused mostly on the management and methodology of its execution. Benchmarking was introduced by practitioners for improving corporate competitive advantage, and the theoretical and academic literature has lagged practitioners in advancing knowledge on these topics. This is especially true in the development and application of models and tools that can aid in the design and analysis of benchmarking and improvement efforts. In this paper, an analytical methodology for "gap analysis" is applied to benchmarking measures and data. The modeling effort uses computational geometric modeling and goal programming for evaluation purposes. The underlying concepts supporting these models are straightforward. The models help identify benchmarks that are best for an organization to pursue in building its competitiveness.

The paper begins with a review of benchmarking and of tools that aid in the evaluation of benchmarking measures. A proposed model is then presented. This model is based on completion of some form of "gap" analysis and then determining the best improvement direction for the company. We develop and illustrate benchmarking for the two and three-dimensional cases, which involve up to three performance measures, and subsequently address the procedure for the n-dimensional case. A simple illustrative example is then provided for the two-dimensional case to give some insight into the procedure. Finally, we conclude with some discussion and future directions related to this benchmarking modeling and application effort.

Benchmarking
The concept of benchmarking has gained much popularity within the last five years, especially since it is explicitly included within the Malcolm Baldrige Award criteria. Benchmarking practices began experiencing growth in the early 1980s, when organizations such as Xerox began to compete more effectively and regain market share from international competitors through their benchmarking efforts. It is a technique that was popularized among Japanese industry members and diffused through competitive international business circles. Benchmarking is still not well defined; over 42 definitions were found by one source (Heib and Daneva, 1995). The original meaning of the word benchmark refers to a metric unit on a scale for measurement. From a managerial perspective, benchmarking has been defined as a continuous, systematic process for evaluating the products, services, and work processes of organizations that are recognized as representing best practices, for the purpose of organizational improvement.

The benchmarking process has a number of levels that can be used in the analysis of an organization. These include (Camp, 1989):

• Internal benchmarking – benchmarking against internal operations or standards, usually in a multi-division or multinational enterprise.
• Industry (or competitive) benchmarking – benchmarking against other companies in the same industry, whether they are direct competitors or not.
• Process (or generic) benchmarking – benchmarking generic processes (e.g. the order receipt and dispatch process) against best operations or leaders in any industry.

Camp (1995), a former Xerox benchmarking champion, separated the industry benchmarking category into "competitive" and "functional" benchmarking, with direct competitor benchmarking fitting under the former category and benchmarking of firms from other industries under the latter. With each of these, the levels of analysis differ, but the tools to evaluate each level can be the same, with a simple alteration of the level of analysis.

The literature on benchmarking methodologies tends to support the standard four-phase continuous improvement cycle (also known as the Deming cycle) of plan, do, check, act (PDCA). One such process, used by the International Benchmarking Clearinghouse (IBC), includes the following major phases (similar approaches may use more phases and sub-phases): plan, collect data, analyze, and adapt and improve.

• Planning includes the preparation of the benchmarking study plan, team selection, partner selection, and process analysis. The "who, when, what, where, and how" need to be determined initially.
• Data collection includes the preparation and administration of questions, acquiring the data, and follow-up activities with partners.
• Analysis includes the determination of performance gaps, identification of best practices, evaluation of methods, and identification of enablers.
• Adaptation and improvement includes the publication of findings and the creation and execution of the implementation plan, which requires that benchmarking not end at the analysis phase.

Benchmarking tools and models
Some of the more popular gap-analysis-based techniques include the "spider" or "radar" diagram and the "Z" chart. These tools are graphical in nature; they are easy to understand and capable of showing multiple dimensions simultaneously, but it is still up to the analyst to integrate these elements into a complete picture. Another approach is the analytic hierarchy process maturity matrix (Eyrich, 1991; Kleinhans et al., 1995), which applies a weighted scoring technique to various benchmarks and provides a single score using perceptual values set forth by decision makers. This multiattribute utility technique does incorporate managerial input and quantifies various measures, but a high degree of subjectivity remains. In addition, the various difficulties associated with the analytic hierarchy process, e.g. rank reversal, still exist.
Statistical methods for analyzing the data include regression and various descriptive statistics (Blumberg, 1994; Schefczyk, 1993; Moseng, 1995). Yet, even with the strong theoretical foundation of statistical tools such as multiple regression, limitations remain: the number of simultaneous inputs and outputs that can be considered (from a dependent/independent variable perspective) is restricted, and regression measures correlation or central tendency, not best practice. Regression does identify important relationships and gaps, but it does not provide a direction for narrowing those gaps.

Data envelopment analysis (DEA) is a mathematical programming technique for evaluating the efficiencies of units and has recently been applied as a benchmarking tool. Even though most applications of DEA may be considered some form of benchmarking evaluation, the two techniques have only recently been explicitly linked. We focus a little more on DEA for benchmarking since it is based on "efficient frontiers" for evaluating the performance of units; efficient frontiers are also used in this paper, within a computational geometry perspective.

The literature has focused on DEA as applied and tested for external benchmarking evaluation. From a benchmarking methodology viewpoint, it has been utilized for the selection of benchmarking partners by Collier and Storbeck (1993) in the telecommunications industry, and by Bell and Morey (1995) for travel management within various organizations. Collier and Storbeck used the standard DEA approach, which calculates "technical" efficiencies, for determining benchmarking partners. Bell and Morey apply the allocative DEA (ADEA) approach for selecting benchmarking partners. The emphasis of the ADEA application is on identifying benchmarking partners that use a different, more cost-efficient mix of resources than the firm under evaluation. Additional work on external benchmarking and DEA includes benchmarking of the banking and finance industry by Barr and Seiford (1994) and of the grocery industry by Athanassopoulos and Ballantine (1995). These works show that DEA can be used effectively as a performance analysis tool, similar to the traditional gap analysis or ratio analysis used in benchmarking.

Schefczyk (1993), Sherman and Ladino (1995), and Sarkis and Talluri (1996) addressed the issues of internal benchmarking and DEA. Schefczyk showed that for internal benchmarking, traditional ratio approaches correlate significantly with simple DEA models, stipulating that this was the case because internal cost structures are more consistent. One limitation of Schefczyk's analysis is that a relatively small number of inputs, outputs, and DMUs were considered. Athanassopoulos and Ballantine (1995) have argued that ratio analysis by itself is insufficient for assessing performance, and that more advanced tools like DEA should be used to complement it. Sherman and Ladino evaluated the internal operations of a banking organization. They looked at 33 operations and found over $6 million in savings, while maintaining the same level of quality service. Sarkis and Talluri (1996) applied cross-efficiency approaches and evaluated processes at various levels of benchmarking unit analysis, with corresponding aggregate and disaggregate levels of analysis. A recent interactive DEA-based model incorporating goal programming has been introduced by Post and Spronk (1999).

Instead of using DEA, a simpler approach that can be more easily grasped by management is presented here. The uniqueness of this paper over existing benchmarking methods is that it effectively considers multiple dimensions and provides the least amount of improvement required in all or some of the dimensions for a process to become non-dominated. It also proposes mathematical programming models that optimally allocate resources in order to attain these improvements. Our methods overcome the limitations of traditional approaches by simultaneously integrating several critical variables and computationally solving for the necessary improvements across multiple dimensions. Although the DEA models for benchmarking come closest to the methods proposed in this paper, the primary difference is that our method is output oriented and does not require information on the process inputs. This feature can be considered a relative advantage because it reduces data gathering efforts, which in turn has a direct effect on cost and time resources.

Computational geometric benchmarking analysis
Computational geometry involves the study of efficient algorithms for solving geometric problems. Its methodologies have numerous applications in areas such as astronomy, geographic information systems, CAD/CAM, data mining, graph drawing, graphics, medical imaging, metrology, molecular modeling, robotics, signal processing, textile layout, typography, video games, and computer vision. For more information, see O'Rourke (1998) and Goodman and O'Rourke (1997). In this paper we illustrate a unique application of computational geometry for benchmarking purposes. In the next two sections we develop geometrically based equations that help analyze benchmarking data for the two-dimensional and three-dimensional cases. Each dimension, as discussed below, represents a performance measure. The benchmarking results are then introduced into mathematical goal programming formulations that allow managers to determine optimal paths for improving organizational performance. A generic framework presented at the end of this section generalizes the approach to n-dimensional analysis.

Two-dimensional benchmarking process
This section demonstrates the generalized two-dimensional benchmarking process, which is extended to the three-dimensional case in the next section. Figure 1 depicts six manufacturing processes (even though the selected level of analysis is the "manufacturing" process, the tool can be used for any unit of analysis, as long as the performance measures are the same) rated against two performance measures represented on the X1 and X2 axes. Example performance measures can range from throughput rates to quality levels of the processes.
We assume that large values for the selected measures represent higher levels of performance. For variables where smaller values are preferred, a scale transformation can be performed. The performance of the six manufacturing processes is represented by the set of points L (p1, q1), M (p2, q2), N (p3, q3), A (p4, q4), B (p5, q5), and C (p6, q6). Since processes A, B, and C are dominated, to envelop them in the convex hull we introduce three dummy points O (0, 0), L1 (0, q1), and N1 (p3, 0). We use q1 and p3 because they are the largest values on the respective axes. A convex hull for these points can be generated using existing hull-generating algorithms (Graham, 1972). It is evident that processes L, M, and N lie on the frontier and are considered non-dominated, while processes A, B, and C are dominated because of their envelopment within the hull.

Figure 1. Two-dimensional benchmarking

The description of the benchmarking process is specifically targeted at these dominated processes. The procedure is illustrated for process B; it is easily generalized to the other dominated processes. In order for process B to improve its operation, the smallest increments in its measures occur by moving to either point BT1 or BT2, which are targets obtained by perpendicular projections of point B on to lines NN1 and MN, respectively. One of these points results in the minimum increments required to reach the frontier. We also assume that these improvements require minimum process reconfiguration and are therefore economically feasible and effective. We determine the perpendicular distances and the projected points as demonstrated below. Although other perpendiculars exist from point B to line extensions, such as to line L1L, we do not consider them because they fall outside the frontier and are therefore inappropriate benchmarks.

Initially we compare the perpendicular distances from point B to MN and NN1, respectively. For instance, the distance BB^T2 is obtained as shown in (1):

BB^T2 = |α p5 + β q5 + γ| / √(α² + β²)   (1)

where α x1 + β x2 + γ = 0 is the equation of line MN, derived as shown below. Let m1 be the slope of line MN; then:

m1 = (q3 − q2) / (p3 − p2)   (2)

Therefore line MN is expressed by:

x2 − [(q3 − q2)/(p3 − p2)] x1 − [q2 − (p2 q3 − p2 q2)/(p3 − p2)] = 0   (3)
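As a sketch, the frontier-identification step just described can be automated with an off-the-shelf hull routine. The snippet below uses SciPy's Qhull wrapper rather than the Graham (1972) scan; the six process ratings are hypothetical illustrative values, not data from the paper:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Hypothetical ratings of six processes on two measures (larger = better)
procs = {"L": (20, 90), "M": (60, 70), "N": (80, 30),
         "A": (30, 50), "B": (50, 40), "C": (70, 20)}
pts = np.array(list(procs.values()), dtype=float)

# Dummy points O (0, 0), L1 (0, q1), N1 (p3, 0) close the hull below the frontier
q1, p3 = pts[:, 1].max(), pts[:, 0].max()
hull = ConvexHull(np.vstack([pts, [(0, 0), (0, q1), (p3, 0)]]))

# Processes appearing as hull vertices are non-dominated; the rest need benchmarks
frontier = {name for i, name in enumerate(procs) if i in hull.vertices}
dominated = set(procs) - frontier
```

Note that Qhull rejects degenerate (e.g. all-collinear) inputs, which a production implementation would need to handle.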

If BB^T2 < BB^T1, then BT2 is used as the benchmark for B; if the converse is true, then BT1 is used. We assume that BB^T2 < BB^T1 and demonstrate the projection of point B on to line MN. Let m2 be the slope of BB^T2. Since MN and BB^T2 are perpendicular to each other, we know that:

m1 m2 = −1  ⇒  m2 = −1/m1   (4)

Therefore line BB^T2 can be expressed by:

x2 + [(p3 − p2)/(q3 − q2)] x1 − [q5 + (p5 p3 − p5 p2)/(q3 − q2)] = 0   (5)

Equations (3) and (5) can be solved for the point BT2 (p″5, q″5). This analysis provides managers with valuable information on the minimum amounts by which the variable values must be increased in order to reach the frontier, i.e. (p″5 − p5, q″5 − q5). In the event that it is too expensive for managers to make this transition at once, they can consider an incremental approach for improving the performance of the process. The following linear goal programming formulation (6) optimally selects the improvement strategy within organizational constraints for process B when BB^T2 < BB^T1:

min d1⁻ + d2⁻
s.t.
c1 (x1 − p5) + c2 (x2 − q5) ≤ b
x1 + d1⁻ = p″5
x2 + d2⁻ = q″5
x2 + [(p3 − p2)/(q3 − q2)] x1 − [q5 + (p5 p3 − p5 p2)/(q3 − q2)] = 0   (6)

where x1 and x2 are the values to which p5 and q5 are to be increased in terms of performance improvement; c1 and c2 are the costs per unit improvement of x1 and x2, respectively; b is the available budget for process improvement; d1⁻ and d2⁻ are deviation variables; and the last constraint represents the equation of the projection line. If BB^T1 < BB^T2, the problem is actually simpler, because NN1 is perpendicular to the X1 axis and has the equation x1 = p3, so the method demonstrated for obtaining the equation of MN (equations (2)-(5)) is not needed. A similar procedure can be utilized to identify benchmarks for processes A and C. For process A, optimal benchmarks can be chosen from AT1, AT2, AT3, and AT4, whereas process C must utilize CT1.
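The projection in equations (4)-(5) and the goal program (6) can likewise be sketched with scipy.optimize.linprog. All specific numbers below (the points M, N, and B, the costs c1 and c2, and the budget b) are hypothetical illustrations, not values from the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical frontier points M, N and dominated process B (larger = better)
M, N, B = np.array([60.0, 70.0]), np.array([80.0, 30.0]), np.array([50.0, 40.0])

# Foot of the perpendicular from B on to line MN, i.e. target BT2 = (p''5, q''5)
d = N - M
t = (B - M).dot(d) / d.dot(d)       # t in [0, 1] means the foot lies on segment MN
target = M + t * d                  # here (70, 50)

# Goal program (6): decision vector (x1, x2, d1-, d2-), hypothetical costs/budget
c1, c2, b = 100.0, 150.0, 1750.0
dp, dq = target - B                 # direction of the projection line B -> BT2
res = linprog(c=[0, 0, 1, 1],       # minimize d1- + d2-
              # budget: c1*(x1 - p5) + c2*(x2 - q5) <= b, rewritten in x1, x2
              A_ub=[[c1, c2, 0, 0]], b_ub=[b + c1 * B[0] + c2 * B[1]],
              A_eq=[[1, 0, 1, 0],               # x1 + d1- = p''5
                    [0, 1, 0, 1],               # x2 + d2- = q''5
                    [-dq, dp, 0, 0]],           # (x1, x2) stays on the projection line
              b_eq=[target[0], target[1], -dq * B[0] + dp * B[1]])
x1, x2 = res.x[:2]                  # partial move toward the target within budget
```

With this budget the solver stops partway along the projection line, at (60, 45), exactly as the incremental-improvement interpretation of (6) intends.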

Three-dimensional benchmarking
For the three-dimensional case we follow an approach similar to that of the previous section. Figure 2 illustrates three processes A, B, and C rated across three performance measures represented on the X1, X2, and X3 axes. The performance of the three processes is depicted by the set of points A (p1, q1, r1), B (p2, q2, r2), and C (p3, q3, r3). We use the three original points and the nine dummy points O (0, 0, 0), A1 (0, q1, r2), A2 (0, q1, 0), A3 (p1, q1, 0), A4 (p1, q1, r2), B1 (p2, 0, r2), B2 (p2, 0, 0), B3 (p2, q2, 0), and D (0, 0, r2) to construct a three-dimensional convex hull. Since process C lies inside the hull, it must use one of the targets CT1, CT2, CT3, or CT4 as possible directions for improvement. The benchmarking process is explained in the following paragraphs.

Figure 2. Three-dimensional benchmarking

Initially, we identify the shortest (perpendicular) distance from C to each of the four planes A1A4BB1D, BB1B2B3, A4BB3A3, and A1A2A3A4. For instance, the distance CC^T3 is given by (7):

CC^T3 = |α p3 + β q3 + γ r3 + δ| / √(α² + β² + γ²)   (7)

where α x1 + β x2 + γ x3 + δ = 0 is the equation of plane A4BB3A3, which can be obtained by the following method.

Given three points A4 (p1, q1, r2), B (p2, q2, r2), and B3 (p2, q2, 0), the coefficients of the plane through these points are given by the determinants:

α = | 1 q1 r2 |    β = | p1 1 r2 |    γ = | p1 q1 1 |    δ = − | p1 q1 r2 |
    | 1 q2 r2 |        | p2 1 r2 |        | p2 q2 1 |          | p2 q2 r2 |
    | 1 q2 0  |        | p2 1 0  |        | p2 q2 1 |          | p2 q2 0  |

After identifying the shortest distance, we project point C on to the corresponding plane in order to obtain the minimum improvements required in the three dimensions to reach the frontier. We illustrate the projection method assuming that the distance CC^T3 is the minimum, which makes CT3 the benchmark for C. Since CC^T3 is orthogonal to plane A4BB3A3, a directional vector of the line CC^T3 is, by definition, a normal vector to the plane. The direction numbers of line CC^T3 through C (p3, q3, r3) and CT3 (p″, q″, r″) are (p″ − p3, q″ − q3, r″ − r3). Since the equation of plane A4BB3A3 is α x1 + β x2 + γ x3 + δ = 0, the normal vector is (α, β, γ). By definition:

(p″ − p3) = λα  ⇒  p″ = λα + p3   (8)

(q″ − q3) = λβ  ⇒  q″ = λβ + q3   (9)

(r″ − r3) = λγ  ⇒  r″ = λγ + r3   (10)

where λ is a constant. Since (p″, q″, r″) lies on the plane α x1 + β x2 + γ x3 + δ = 0, it must satisfy the plane equation. Therefore:

α (λα + p3) + β (λβ + q3) + γ (λγ + r3) + δ = 0   (11)

The value of λ can be determined from this equation and is in turn used in equations (8) through (10) to determine the projection of C on to plane A4BB3A3. This identifies the point CT3 (p″, q″, r″). The minimum changes required for process C to reach the frontier are therefore (p″ − p3, q″ − q3, r″ − r3). For resource allocation, the following linear goal program can be utilized to select the optimal strategy:
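Equations (7)-(11) amount to a point-plane distance and an orthogonal projection, which can be sketched as follows. The plane coefficients are computed here with a cross product, which is equivalent to the determinant expansion above; the four points are hypothetical stand-ins for A4, B, B3, and C:

```python
import numpy as np

def plane_through(P1, P2, P3):
    """Coefficients (alpha, beta, gamma) and delta of the plane through three points."""
    P1, P2, P3 = (np.asarray(P, dtype=float) for P in (P1, P2, P3))
    n = np.cross(P2 - P1, P3 - P1)   # normal vector (alpha, beta, gamma)
    return n, -n.dot(P1)             # delta, since n . P1 + delta = 0

def project_on_plane(C, n, delta):
    """Orthogonal projection of C, i.e. equations (8)-(10) with lambda from (11)."""
    C = np.asarray(C, dtype=float)
    lam = -(n.dot(C) + delta) / n.dot(n)   # substitute (8)-(10) into (11), solve
    return C + lam * n                     # the target point (p'', q'', r'')

# Hypothetical points in the roles of A4, B, B3, and the dominated process C
n, delta = plane_through((1, 1, 2), (3, 2, 2), (3, 2, 0))
target = project_on_plane((2, 2, 1), n, delta)
dist = abs(n.dot([2, 2, 1]) + delta) / np.linalg.norm(n)   # equation (7)
```

The same substitution for λ works unchanged in any number of dimensions, which is what makes the n-dimensional extension below mechanical.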


min d1⁻ + d2⁻ + d3⁻
s.t.
c1 (x1 − p3) + c2 (x2 − q3) + c3 (x3 − r3) ≤ b
x1 + d1⁻ = p″
x2 + d2⁻ = q″
x3 + d3⁻ = r″
(x1 − p3) − [(p″ − p3)/(q″ − q3)] (x2 − q3) = 0
(x2 − q3) − [(q″ − q3)/(r″ − r3)] (x3 − r3) = 0
(x1 − p3) − [(p″ − p3)/(r″ − r3)] (x3 − r3) = 0   (12)

where x1, x2, and x3 are the values to which p3, q3, and r3 are to be increased in terms of performance improvement; c1, c2, and c3 are the costs per unit improvement of x1, x2, and x3, respectively; b is the available budget for process improvement; d1⁻, d2⁻, and d3⁻ are deviation variables; and the last three constraints represent the equations of the projection line in space.

N-dimensional benchmarking
Although the n-dimensional case can prove more challenging, it is an extension of the two and three-dimensional cases presented above and involves the same sequence of steps. Figure 3 provides a flow chart of the procedure for the n-dimensional case.

Figure 3. N-dimensional benchmarking process

Illustrative example
In this section we provide an illustrative application of the proposed benchmarking process for the two-dimensional case. Consider three manufacturing processes A, B, and C that are evaluated on two performance measures: yield rate (per cent) and throughput rate (units/time). The yield rates for processes A, B, and C are assumed to be 100, 80, and 70, respectively, and the throughput rates are assumed to be 50, 110, and 60, respectively; throughput rate is plotted on the X1 axis and yield rate on the X2 axis. The convex hull with the three dummy points (A1, B1, and O) is represented by A1ABB1O, which is shown in Figure 4.

Figure 4. Two-dimensional illustrative example

Since point C is enveloped in the hull, it is the dominated process and requires benchmarks for improvement. Initially, we evaluate the equation of line AB, which is x1 + 3x2 = 350. Similarly, line BB1 is given by x1 = 110. The perpendicular distances CC^T1 and CC^T2, evaluated using equation (1), are 25.3 and 50, respectively. Since CC^T1 < CC^T2, we use CT1 as the benchmark for process C. The equation of line CC^T1 is 3x1 − x2 = 110. The equations of lines AB and CC^T1 are solved to obtain the point of intersection CT1 (68, 94). Therefore, the targets that process C must achieve in throughput and yield rates are 68 and 94, respectively. Assuming that the costs associated with a unit increase in throughput rate and a percentage-point increase in yield rate are $1,500 and $2,000, respectively, and the allowable budget is $50,000, the following goal programming problem identifies the optimal budget allocation strategy.
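Before stating the goal program formally, the example's numbers can be checked with a short script (a sketch using SciPy; arrays are ordered (throughput, yield), the ordering implied by the line equations x1 + 3x2 = 350 and x1 = 110):

```python
import numpy as np
from scipy.optimize import linprog

# Processes as (throughput, yield) points
A, B, C = np.array([50, 100]), np.array([110, 80]), np.array([60, 70])

# Perpendicular distance from C to line AB: x1 + 3*x2 - 350 = 0 (equation (1))
d_ct1 = abs(C[0] + 3 * C[1] - 350) / np.sqrt(1 ** 2 + 3 ** 2)   # ~25.3
d_ct2 = abs(110 - C[0])                                          # 50, to line BB1: x1 = 110

# Target CT1: intersect AB with the perpendicular through C, 3*x1 - x2 = 110
ct1 = np.linalg.solve([[1, 3], [3, -1]], [350, 110])             # (68, 94)

# Goal program: variables (x1, x2, d1-, d2-); the budget constraint
# 1500*(x1 - 60) + 2000*(x2 - 70) <= 50000 is expanded to constant form
res = linprog(c=[0, 0, 1, 1],
              A_ub=[[1500, 2000, 0, 0]], b_ub=[50000 + 1500 * 60 + 2000 * 70],
              A_eq=[[1, 0, 1, 0], [0, 1, 0, 1], [3, -1, 0, 0]],
              b_eq=[68, 94, 110])
x1, x2 = res.x[:2]   # ~66.67 and 90: improvements of roughly 6.7 units and 20 points
```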

min d1⁻ + d2⁻
s.t.
1500 (x1 − 60) + 2000 (x2 − 70) ≤ 50000
x1 + d1⁻ = 68
x2 + d2⁻ = 94
3x1 − x2 = 110

x1, x2, d1⁻, d2⁻ ≥ 0.

The solution to this problem identified maximum improvements of 6.66 units in throughput rate and 20 per cent in yield rate. These are the values achievable within the allowable budget while leaving the organization the least distance from the benchmarks. Process C should therefore use an incremental approach in achieving the final target values of 68 and 94 in throughput and yield rates, respectively.

Conclusions
In this paper we have proposed a benchmarking method based on computational geometry and goal programming. We have developed the model for the two and three-dimensional benchmarking cases, addressed the procedure for the n-dimensional case, and provided an illustrative example that details the application of the model. The proposed models effectively identify benchmarks for poorly performing processes and provide management with an optimal strategy for approaching benchmark targets. Even though this technique considered manufacturing processes, any process, department, function, or generic unit can be analyzed; as long as there are managerially accepted benchmark measures, the application of the technique is quite similar. As with any conceptual analytical model, the models presented here need to be evaluated in actual industrial settings.

Even though the models presented use only two or three performance measures, in some cases this may be an appropriate level of complexity. Many practitioners and researchers argue that simplicity and ease of use should be considered for performance measures and measurement systems (Maskell, 1991; Sheridan, 1993; DeToro, 1995; Blossom and Bradley, 1998). This simplification may mean limiting the number of performance measures used (Blossom and Bradley, 1998). Lower levels of operational analysis (such as system or process) would tend to use fewer measures than more strategic units of analysis (plant or division). But even at higher levels of analysis, aggregation still limits the number of measures used for performance analysis and benchmarking. Given this argument that two or three performance measures may be adequate, a useful extension of this work will be the development of the benchmarking methods for the n-dimensional case, which can prove more challenging from an analytical perspective. The methodology for gap analysis presented here requires only that a simple linear programming solver be available. It can provide substantial insight into directions for improvement that an organization can pursue to help maintain a competitive advantage.
But even at the higher levels of analysis, aggregation still limits the number of measures used for performance analysis and benchmarking. Given this argument that two or three performance measures may be adequate, a still useful extension for this work will be the development of the benchmarking methods for the n-dimensional case. This extension can prove to be more challenging from an analytical perspective. The methodology for gap analysis presented here only requires that simple linear programming solver be available. It can provide substantial insight into directions for improvement of operations that an organization can pursue to help them maintain a competitive advantage.


References
Athanassopoulos, A.D. and Ballantine, J.A. (1995), "Ratio and frontier analysis for assessing corporate performance: evidence from the grocery industry in the UK", Journal of the Operational Research Society, Vol. 46 No. 4, pp. 427-40.
Barr, R.S. and Seiford, L.M. (1994), "Benchmarking with data envelopment analysis", presented at the ORSA/TIMS Joint National Meeting, Detroit, MI, October 1994.
Bell, R.A. and Morey, R.C. (1995), "Increasing the efficiency of corporate travel management through macro benchmarking", Journal of Travel Research, Vol. 33 No. 3, pp. 11-20.
Blossom, A.P. and Bradley, J.R. (1998), "Nine mistakes managers make in using performance measures to motivate employees", working paper, S.C. Johnson Graduate School of Management, Cornell University, Ithaca, NY.
Blumberg, D.F. (1994), "Strategic benchmarking of service and logistic support operations", Journal of Business Logistics, Vol. 15 No. 2, pp. 89-119.
Camp, R.C. (1989), Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance, ASQC Quality Press, Milwaukee, WI.
Camp, R.C. (1995), Business Process Benchmarking: Finding and Implementing Best Practices, ASQC Quality Press, Milwaukee, WI.
Collier, D.A. and Storbeck, J.E. (1993), "A data envelopment approach to benchmarking in the telecommunications industry", working paper, Faculty of Management Science, Ohio State University, Columbus, OH.
DeToro, I. (1995), "The 10 pitfalls of benchmarking", Quality Progress, Vol. 28 No. 1, pp. 61-3.
Eyrich, H.G. (1991), "Benchmarking to become the best of breed", Manufacturing Systems, Vol. 9 No. 4, pp. 40-7.
Goodman, J.E. and O'Rourke, J. (1997), Handbook of Discrete and Computational Geometry, CRC Press LLC.
Graham, R.L. (1972), "An efficient algorithm for determining the convex hull of a finite planar set", Information Processing Letters, Vol. 1, pp. 132-3.
Heib, R. and Daneva, M. (1995), "Benchmarks and benchmarking: a definitional analysis", working paper, Institute for Information Systems, University of Saarlandes, Germany.
Kleinhans, S., Merle, C. and Doumeingts, G. (1995), "Determination of what to benchmark: a customer-oriented methodology", in Rolstadas, A. (Ed.), Benchmarking Theory and Practice, Chapman & Hall, London, pp. 267-76.
Maskell, B.H. (1991), Productivity Measurement for World Class Manufacturing, Productivity Press Inc., Cambridge, MA.
Moseng, B. (1995), "Productivity measurement: methods and tools developed in TOPP", in Rolstadas, A. (Ed.), Benchmarking Theory and Practice, Chapman & Hall, London, pp. 248-60.
O'Rourke, J. (1998), Computational Geometry in C, 2nd ed., Cambridge University Press, Cambridge.
Post, T. and Spronk, J. (1999), "Performance benchmarking using interactive data envelopment analysis", European Journal of Operational Research, Vol. 115 No. 3.
Sarkis, J. and Talluri, S. (1996), "Efficiency valuation and internal benchmarking for business process improvement", Journal of Engineering Valuation and Cost Analysis, Vol. 1 No. 1, pp. 43-54.
Schefczyk, M. (1993), "Industrial benchmarking: a case study of performance analysis techniques", International Journal of Production Economics, Vol. 32 No. 1, pp. 1-11.
Sheridan, J.H. (1993), "Where benchmarkers go wrong", Industry Week, Vol. 242 No. 6, pp. 28-34.
Sherman, H.D. and Ladino, G. (1995), "Managing bank productivity using data envelopment analysis (DEA)", Interfaces, Vol. 25 No. 2, pp. 60-73.