Cost Saving Breakthrough from Combined 6 Sigma & MOST Methodologies to Set Production Standards

Abstract ID: 135
Y. Sthong, Serene Choo
A1 OPEX Industrial Engineering

Abstract
Predetermined Time Standards systems (MTM, MOST and MODAPTS) for work measurement are known for their reliability and consistency. The downside, however, is that they consume a great deal of professional resource to apply. At the normal rate, one experienced IE needs 8 hours to complete an analysis of a 30-minute segment of fairly complicated, non-standard human motions. To analyze a process with a 30-hour cycle time, 2-3 months of full working days (excluding the pre- and post-analysis activities) are required. This is far too resource-consuming and expensive to justify engaging an IE to apply MOST to 100% of a work process in order to determine direct workforce levels or find Kaizen opportunities. This paper is based on case studies of applying MOST in a heavily manual EMS/contract-manufacturing environment. It discusses applying 6 Sigma methodology to determine the optimum sampling plan and criteria for MOST application. The combination of MOST sampling and full videotaping is shown to produce results statistically equivalent to a 100% MOST analysis. This is a cost-saving breakthrough that makes MOST analysis affordable and widens its scope to benefit more organizations, especially narrow-margin industries, in setting standards.

Keywords Industrial engineering, time studies, standard cycle time, work measurement, predetermined time standards, human motions, manufacturing productivity, MOST, MTM, MODAPTS, 6 Sigma sampling.

1. Introduction
Predetermined Time Standards development began in the early 1930s, based on the studies and work of Frederick Taylor, Frank and Lillian Gilbreth, Henry Gantt and others who originated the concepts of scientific management for work measurement. Among the earliest systems were Method Time Analysis, followed by Work Factor. A decade later, Harold Maynard developed another system called Methods-Time Measurement, better known as MTM. The so-called "second generation" systems introduced later included MSD (Master Standard Data), MTM-2, MTM-3 and MOST (Maynard Operation Sequence Technique). In 1966, the third-generation system, MODAPTS (Modular Arrangement of Predetermined Time Standards), was published.

The commonality of all the systems on the market is that they are based on analyzing human capabilities and hence determining the average rates at which tasks can be accomplished. In detailed laboratory research focused on people at work, all motions can be broken down and classified into small elements and categories. These elements were then measured in units as fine as fractions of a second. The unit of measurement for MOST is the TMU (Time Measurement Unit), where one TMU = 0.036 seconds, whilst MODAPTS uses the MOD as its single unit of time, where one MOD = 0.129 seconds. These time elements are the basic foundation of the predetermined time standards. As all of our body motions are combinations of multiple small motions in certain sequences, the time elements and the types of movement are all given specific codes. Table 1 shows examples of frequently used codes.

Table 1: Examples of Codes in MOST and MODAPTS
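The two unit definitions above can be captured as a small conversion helper. This is a minimal sketch; the constants are exactly the values quoted in the text:

```python
# Unit conversions quoted in the text:
#   MOST:    1 TMU (Time Measurement Unit) = 0.036 s
#   MODAPTS: 1 MOD                          = 0.129 s
TMU_SECONDS = 0.036
MOD_SECONDS = 0.129

def tmu_to_seconds(tmu):
    """Convert MOST TMUs to seconds."""
    return tmu * TMU_SECONDS

def mod_to_seconds(mod):
    """Convert MODAPTS MODs to seconds."""
    return mod * MOD_SECONDS
```

For example, 1,000 TMU corresponds to 36 seconds, so 100,000 TMU make up exactly one hour.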

2. The Applications of Predetermined Time Standards
With a standardized and systematic classification of the time elements associated with human motions, a predetermined time standards system can measure task time more accurately than stopwatch time studies. Given a well-defined and described process and work standard, the system measures the performance of a task systematically rather than subjectively ("good", "bad", "slow", "fast", etc.). It provides a fair measurement for the workers who carry out the task, and likewise an Industrial Engineer (IE)/analyst is able to determine a fair task time. The speed of work is no longer described merely as fast or slow, so the work process can be evaluated objectively. Because human speed is evaluated from basic human motions, these systems can be used widely, not only in the factory but in other work environments as well: warehouses, office operations, repair and maintenance, tool set-up, etc. The managers or engineers who design the work process are able to separate the "human elements and their speed" from machining time in their evaluation.

One of the key differences between a predetermined time standards system and stopwatch studies is the reporting system. With the time and human-motion codes standardized up front, an IE/analyst assigns the codes accordingly. This reporting process is systematic and objective: the process owner and the workers know their performance, and a manager knows the work quantity of the employees. The report serves not only as a good control document for performance measurement, but also for best-method documentation, benchmarking, re-engineering, costing, determining labor hiring models and setting future goals objectively. For the IE or practitioner, a standardized and simple reporting system means ease of proliferation.

By referring to the codes in the record, an experienced IE, even one who did not perform the analysis, is able to comprehend the analysis accurately. This is good for IE resource coverage and leverage, and it reduces the time spent on data recollection and analysis. Table 2 is an example of an analysis based on the MOST method.

Table 2: MOST Analysis Sample
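As a hedged illustration of how such codes translate to time: in BasicMOST, the index values in a sequence model such as A1 B0 G1 A1 B0 P1 A0 are summed and multiplied by 10 to give TMUs, with 1 TMU = 0.036 s. The helper below is an illustrative sketch of that rule, not a replacement for a full MOST analysis tool:

```python
import re

def most_sequence_seconds(sequence):
    """Time for a BasicMOST sequence: sum of the index values x 10 TMU,
    converted to seconds at 0.036 s per TMU."""
    indices = [int(n) for n in re.findall(r"[A-Z](\d+)", sequence)]
    return sum(indices) * 10 * 0.036

# "A1 B0 G1 A1 B0 P1 A0": indices sum to 4 -> 40 TMU -> 1.44 s
```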

3. The Process of Using a Predetermined Time Standards System
Training and certification in the chosen predetermined time standards system are needed before engaging the selected program. However, the attendees should not be limited to the IE or analyst: it is highly recommended that the managers, process engineers and supervisors responsible for job standards, capacity and manpower management, and any other costing-related management, understand the concepts and the method of deployment. In the actual workplace, a typical process flow for determining standard time using MOST (or a similar predetermined time standards system) is shown in Chart 1.

Chart 1: Predetermined Time Standards System Process Flow Chart

3.1 Understanding the Analysis Purpose
Any predetermined time standards system can provide analyses that meet various objectives. The IE or analyst should seek to understand the purpose, advise the analysis requesters about the expected outcome, and set mutual agreement. Whether the goal is what-if alternatives or developing standard time, the IE/analyst must be provided with the standard procedure so that actual task time can be compared with developed task time against that standard procedure, so to speak, a reference point. An expected duration for analysis completion should be stated up front to ensure ample time for a thorough analysis.

3.2 Data Collection
The MOST method analyzes motions measured in fractions of a second. At that level of detail, it is typically impossible for an IE/analyst to observe and memorize the sequence of body motions, document the motion descriptions and assign the codes on the spot. Thus, a video recorder is normally used to record the whole sequence so that it can be played back multiple times during the analysis. A tripod may help for long recordings, but on the other hand it limits the flexibility of taking footage from various angles, which the analyst needs to see the movements accurately.

3.2.1 Outlier Screening and Identification
The analyst should be alert to whether the footage taken is of acceptable quality. The commonly observed (but not exhaustive) unacceptable scenarios are as follows, and rectification/action should be taken:
• The work performed by the workers deviates from the process steps or the correct sequence in the documented standard operating procedure.
• The actual inspection pattern or frequency is totally deviated from, or inconsistent with, what is described. For example: in cycle #1, the worker checks 3 different locations; in cycle #2, just a quick sweep with no detailed inspection; in cycle #3, the worker checks one of the inspection points with extra caution.
• The actual cleaning requirement on the raw material prior to assembly is too vague in the documented process, or the part quality is inconsistent from cycle to cycle. For example: in cycle #1, only fingers are used to remove visible dirt; in cycle #2, a piece of cloth is needed to rub it once; in cycle #3, repeated rubbing is required.
• The operating tools (hammer, type of drill, lighting, workstation set-up) used in the actual process do not match the specifications.
• The workers are not familiar with the task, particularly the sequence. Too many start-stop scenarios prolong the video screening time; likewise, excessively long worker "thinking" time to recall the task sequence and/or decide the inspection criteria or key points garbles the facts of "human data processing" capability versus unfamiliarity with the given task.

The analyst should seek clarification with the process owner on the standard to determine the actual task. If the process or quality performance is too inconsistent, this is a symptom that the process is not ready for task-time development. However, if the data collection objective is to compare cycles resulting from different methods, the case studies or comparison criteria must be clearly defined. To avoid the "unclean" or outlier data points discussed above in the raw data (the recorded video), the video must be screened warily to ensure the entire standard process is captured, inclusive of all motions for the task. An experienced MOST analyst should be able to spot abnormalities in the footage and decide whether to edit, retake the video and/or seek confirmation from the process owner, regardless of the objectives of the analysis.
3.3 Data Analysis & Data Validation
3.3.1 Data Analysis
For a task with no pre-work or analysis from any form of predetermined time standards system, data analysis should start with documenting the steps of each task in as much detail as possible (see Table 3); typically one line of task description with the assigned codes next to it. The key benefit of this very detailed level of documentation is that it provides a quick and clear correlation between the motions and the codes associated with them. It noticeably shortens the time required in future benchmarking and continuous re-engineering activities to understand and identify the profile of each process step and the types of motion associated with it. It also assists in setting goals more objectively; for example, reducing, automating or lowering the complex motions with high-value codes by a certain percentage.

Table 3: Example of MOST Analysis Report

There are a few key points on which a MOST (or any predetermined time standards) practitioner should take extra caution. First and foremost, as motion is broken down into fractions of a second, it is impossible for an analyst to play the video once at normal speed and be able to observe, memorize and translate every single movement into the task description, then analyze it to the level that matches the motion codes. Secondly, the majority of motions are not as simple, straightforward and explicitly describable as those in classroom exercises. They are tricky, but analyzable given the time and effort to replay the video repeatedly at very low speed. The analyst should ensure video playback capability is ready to support the analysis work. Simplifying and summarizing the movements will lead to inaccurate analysis and tedious data validation (see 3.3.2 for details).

3.3.2 Data Validation
MOST data analysis is not just a task of watching video and assigning matching codes; the assigned codes should be practical. An experienced IE is normally alert to a large gap between two compared data sets:
• the analysis results against the actual video-captured task time, or
• a significant gap between the newly developed analysis results and previous versions, or
• the analysis results versus the project development team's estimation.
The main causes of gaps are possibly: 1) overlooking details; 2) assuming certain motions can be achieved in a normal posture, at a normal distance and without obstacles, so that MOST codes with lower values are assigned. Rules of thumb to develop a practical and trustworthy MOST analysis:
• The analyst should review the codes and validate them with his/her own motions.
• Never assume, or form the prior opinion, that the worker in the video is too fast or too slow.
• Even if there is indeed waste in the existing process steps, do not predict that it is removable and assign a code for the ideal situation; set that aside as an improvement opportunity.
• Lastly, the analyst should always validate through self-simulation of the assigned codes. This may double the analysis time, but it ensures a fair evaluation of the workers, and the organization truly gains the benefit of a realistic work measurement.

3.4 Analysis Report
The detailed documentation of each task during the analysis phase is normally tabulated in MS Excel. The analysis report is then relatively easy, and the task time can flexibly be summarized at any preferred level of detail: by station, by operator, by the entire production line or by operation area. For long tasks, ensure a proper, systematic and organized naming convention is applied to the artifacts (main and sub-spreadsheets, videos, etc.) for ease of future reference.

4. Problem Description
A predetermined time standards system is without doubt a highly systematic tool, organized in its data and applications, and it is recognized for fast estimation of simple, straightforward and repetitive motions. However, many practitioners find these systems rather tedious and time-consuming to apply in a real manual-assembly manufacturing environment. A few commonly observed contributing factors follow.

During the data collection stage (in addition to the outlier data mentioned in section 3.2.1):
• Unlike a stopwatch study, video recording needs more set-up preparation. The effort includes the analyst's collaboration with the operation owner on a suitable schedule and qualified workers, to avoid frequent starts and stops and video retakes.
• Particularly in a highly manual assembly environment, where workmanship plays a significant role in final product quality, detailed motions are normally not spelled out in the procedure. The IE ends up spending much time referring back and seeking clarification to comprehend the gap between the analysis results and the expected estimated time at a later stage.
• The process is not defined to the appropriate level. A bad example: "check and clean Part Y before inserting into Location X." This is very vague and subject to the workers' experience and judgment, the quality of pre-assembled/processed parts, and other hidden factors. Again, this induces wrong raw data and another round of video retakes or clarification of the task requirement before the right codes can be assigned.

During the video screening and analysis stage:
• If the scenarios mentioned in the data collection stage above are not rectified, it is no different from passing "defective" data to the subsequent screening stage. The IE will spend extra time not only to "cross-correct" the human speed, but also to judge and segregate the "noise" in the data pool so that the analysis is as pure and as close as possible to the standard procedure, allowing a fair standard task time to be established.
• Not all types of motion are explicitly described and represented by MOST codes. Commonly found examples are excessive juggling, untangling interlocking bundles of wires and cables, inspecting detailed components in semi-hidden locations, etc. A tailor-made code must be created to overcome the inadequacy of the existing codes (in time value, or the uniqueness or rarity of the type of motion).
• Long-cycle-time and non-repetitive work performed by workers is tedious to analyze. Deciding a code for a task influenced by concentration level is a challenge for a MOST analyst.
• Parts to be assembled are not arranged per the specification, so extra time is needed for sorting, counting, checking and searching before assembly. All these actions are waste; in a stopwatch study they would simply be categorized as such, but for MOST the analyst still spends time assigning codes accordingly.
• With all these contributing factors combined, it is no surprise to find the video being replayed 5-6 or more times at extremely low speed in order to watch every single motion.
• For a new IE, factor in additional analysis time: it takes time and effort to memorize the codes!

5. Sampling as an Alternative
Based on a case study using MOST, an experienced IE needs 8 hours to complete an analysis of a 30-minute segment of fairly complicated, non-standard human motions without sophisticated software. To analyze a process with a 30-hour cycle time, 2-3 months of full working days are required (this covers only the activities in section 3.3, excluding pre- and post-analysis activities). This is far too resource-consuming and expensive to justify engaging an IE to apply MOST to establish 100% of task standard times. An optimum sampling plan is ideal for MOST application, balancing the needs against IE resource. To start a sampling plan, understanding the correlation factors is the foremost step.

5.1 Understanding Correlation Factors
As predetermined time standards are based on basic human motions, the deviation of a repetitive motion by the same person under consistent conditions should be insignificant. (For instance, if it takes 10 seconds to walk X feet from point A to B, it will take approximately 10 seconds to walk from B to A along the same route.) However, studies show that other external factors influence the consistency of speed. Understanding these correlation factors helps in choosing the right populations, sampling percentage and other parameters and assumptions. Below are some observed correlation factors contributing to speed (in)consistency:
• Simple, low-value motions are not likely to show large variance in speed.
• Inadequately trained or inexperienced workers tend to show large variance.
• Machine-driven/dominant task times tend to have narrower variance.
• Straightforward, short processes that are consistent from cycle to cycle contribute to small variance.
• Consistent raw-part/pre-assembled-part quality contributes to a stable process, and thus lower variance.
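The resource estimate opening this section is simple arithmetic, sketched below. The 8-hours-per-30-minute-segment rate and the 8-hour workday are the figures quoted in the text:

```python
ANALYSIS_HOURS_PER_30_MIN_SEGMENT = 8  # quoted rate for an experienced IE

def full_analysis_workdays(cycle_time_hours, workday_hours=8):
    """Working days of IE effort for a full (100%) MOST analysis."""
    segments = cycle_time_hours * 2  # number of 30-minute segments
    return segments * ANALYSIS_HOURS_PER_30_MIN_SEGMENT / workday_hours

# A 30-hour cycle: 60 segments x 8 h = 480 h of analysis = 60 working days,
# consistent with the 2-3 months quoted (excluding pre/post-analysis work).
```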
5.2 Sampling Criteria Establishment
To ensure a data set is usable as a sample to represent a population, it is recommended to use the factors listed in section 5.1 as outlier-filtering criteria:
• "Stable and consistent" worker performance: all workers must be trained, certified and have adequate practice.
• "Stable and consistent" equipment, process and product quality performance: inconsistent or too-frequent design or process changes will result in form-fit issues that may need extra alignment, visual inspection, cleaning and re-processing. Process/part/tool performance of this kind is totally excluded from the sampling targets.
• No unstable or incorrect product quality standards: ambiguous inspection criteria shouldered by workers produce bad data points and should be totally eliminated.
• Include tasks of relatively short, medium and long cycle time to allow a balancing effect, and spread samples throughout the entire process to avoid skewing them toward any portion of it.
• Data points should be based on acceptable quality and a complete set of video footage.

Even if all 5 points above are fulfilled, there is still one potential key factor to watch out for: a task that requires extraordinary skill, with multiple fine and tricky finger motions to position and assemble a part, will contribute to an inconsistent task time from worker to worker and cycle to cycle, because the end result is influenced by the thinking process and workmanship quality. If a station or worker handles a task of this kind, a modified sampling plan is needed, as summarized in Table 4.

5.3 Sampling Pattern
There are several possible sampling patterns; two are selected for discussion in this paper. As most factories or manufacturing lines are set up either by functional group or by product line, the sampling plan can be designed to follow either method, or the two can be combined when deemed suitable.
• Method #1: Select X% of the stations from the entire line for full MOST analysis, derive the estimator factor from the selected stations (see Table 4 for the computation), and apply that factor to the entire line.
• Method #2: Sample every station, covering Y% of the length of the task time for that particular station (or a subset of a group of activities). Derive an estimator factor for each station and use it to estimate that station's task time. The entire process task time is the sum of all the stations' estimated task times.

Table 4: Sampling Method Computation Steps
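The two methods can be sketched in a few lines. The paper's estimator-factor equation lives in Table 4 and is not reproduced here, so this sketch assumes the factor is the ratio of MOST-developed time to raw video time on the sampled portion, which is then applied to the unsampled remainder; treat the functional form as an assumption, not the paper's definition:

```python
def estimator_factor(most_time, video_time):
    # Assumed form: MOST-analyzed time divided by observed video time.
    return most_time / video_time

def method_1_estimate(sampled_most, sampled_video, line_video_total):
    """Method #1: one line-wide factor from X% of fully analyzed stations,
    applied to the whole line's video time."""
    factor = estimator_factor(sum(sampled_most), sum(sampled_video))
    return factor * line_video_total

def method_2_estimate(stations):
    """Method #2: a 'localized' factor per station.
    stations: list of (sampled_most, sampled_video, station_video_total)."""
    return sum(estimator_factor(m, v) * total for m, v, total in stations)
```

For example, if the sampled stations took 10 hours of MOST-developed time against 20 hours of video, Method #1 scales a 100-video-hour line to an estimated 50 hours of standard task time.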

5.4 Considerations for Sampling Pattern Selection
When selecting a sampling pattern, rather than weighing the respective pros and cons, the focus should be on matching the task with the suitable sampling method. Table 5 lists the major influencing factors captured in sections 5.1 and 5.2. Following the table increases effectiveness and benefit by keeping noise out of the raw data when deciding between the methods.

Table 5: Consideration Factors for Selecting the Type of Estimator Factor

Method 1 is best used when conditions are extremely consistent. As shown in Table 5, it suits situations with consistent part and product quality and a consistent, predictable yield trend; where the process and tools used by the workers are stable, meaning no rework or similar tasks carried out repeatedly; and where the workers are well trained and consistent in performance, with no special skills required for any part of the process. Work suitable for Method 1 is normally found in straightforward, routine and simple assembly lines, performed without much thinking or with automated guided tooling; for example, packing finished goods into a box, or loading and unloading a standard-sized bundle of products at the input and output loaders. No intensive inspection or special skill is needed to ensure workmanship in such processes. If any factor falls outside these conditions, Method 2 should be chosen. Method 2 is preferred then because the estimator factor is derived, and used, per station. In other words, it is a "localized" factor that normalizes the gap surfacing at that particular station. Hence a localized factor is effective whether the process is stable or not.

6. Win-Win Sampling Plans
There is no hard and fast rule for the suitability of a sampling plan. As in any other study using standard statistical methods, an organization intending to apply sampled MOST analysis should consult a statistician to construct a sampling plan that matches the confidence interval it needs. Although industrial practice commonly follows the rule of thumb of a 95% confidence interval (CI), a MOST sampling plan should be carefully evaluated and modified to meet certain desirable properties and assumptions rather than sticking strictly to a 95% CI throughout the product development phases.

Referring to Chart 2 below, the case study shows the variance of video time versus MOST-developed task time for each station in the production line. The flat bell curve (black line) results from too many unknown noise sources in the data points, contributed by an unstable process, unstable quality performance and newly trained workers. Achieving a 95% CI without extensive sorting would require nearly 100% of the data points, which is closer to a full analysis than a sampled study. Conversely, the green curve represents the stations in the product-mature stage. With variability reduced by process, machine and product-quality stability, the key remaining contributing factor is human motion. Hence the sampling percentage can be reduced while still safely maintaining the desired high CI.

Note: with the systematic reduction of variability contributed by the process, machines, tools and fixtures, product quality and the workers' learning curve, the sampling rate/sample size should be revised to keep the sampling plan practical and economical for modeling the population. It is not realistic to maintain the same sampling rate throughout the product development cycle and still achieve the same CI. For the NPI (New Product Introduction) stage, the sampling rate should be higher than in the product-mature stage. Sampling performed during product maturity is the better approach to provide a fair task-time analysis and determination for workers.

Chart 2: Relationship of Product Development Phase and Variance of Video Time vs MOST Time
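The argument above, that a maturing (lower-variance) process needs a far smaller sample for the same CI, can be illustrated with the standard normal-approximation sizing formula. The sigma and margin figures below are hypothetical illustrations, not values from the case study:

```python
import math

def sample_size(sigma, margin, z=1.96):
    """Samples needed so the mean's 95% CI half-width is at most `margin`
    (normal approximation: n = (z * sigma / margin)^2, rounded up)."""
    return math.ceil((z * sigma / margin) ** 2)

# Hypothetical NPI stage: noisy tasks, sigma = 12 s, margin = 2 s.
# Hypothetical mature stage: sigma = 4 s, same margin.
# The 3x drop in sigma cuts the required sample by roughly 9x,
# mirroring the black-curve vs green-curve contrast in Chart 2.
```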

7. Cost Savings
Geographical region, type of industry and the availability of IE resource vary widely, and so do the savings from a MOST sampling plan. The key factors to weigh when deciding the best savings plan are normally the length of the cycle time, IE resources, IE technical skills, revenue rate per worker, the environment supporting the studies, the use of the data, the duration allocated for the analysis work, and more. When a constraint is identified, modify the sampling rate and tweak the sorting based on the guidelines provided. To review the plan objectively, consider an ROI (Return on Investment) calculation to aid decision making. As a sampling plan is not carved in stone, it can be modified at any time to meet the needs. Since the NPI (New Product Introduction) stage is full of what-if scenarios, combining stopwatch studies with MOST is a more practical resource plan for quick and effective time estimation without sacrificing accuracy. It is not necessary to rely on only one method for task-time determination.
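As a hedged example of the ROI check suggested above, with entirely hypothetical figures (the sampling rate, setup overhead and the idea of pricing effort at an IE hourly rate are assumptions for illustration, not the paper's data):

```python
def sampled_plan_roi(full_hours, sampling_rate, overhead_hours, ie_rate):
    """ROI of a sampled plan versus a full analysis: the savings divided by
    the cost actually invested in the sampled analysis."""
    full_cost = full_hours * ie_rate
    sampled_cost = (full_hours * sampling_rate + overhead_hours) * ie_rate
    return (full_cost - sampled_cost) / sampled_cost

# 480 h of full analysis, 20% sampling, 40 h of extra setup/validation:
# IE time spent drops from 480 h to 136 h; ROI on that spend is about 2.5x.
```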

8. Summary
With labor costs on a rising trend globally, driving breakthroughs in labor productivity is no longer optional but inevitable. Efforts in automation, making processes and systems leaner and exploring new cost-saving methods all save resources, but setting the right standard for task time, especially labor interaction time, is often neglected. Without understanding the actual task time and the labor contribution, only half the battle is won. Time study is an essential step to understand human capacity, to set the right baseline, to create a positive labor workforce with objective measurement of work performance, and subsequently to set realistic stretch goals. The combination of MOST sampling and full videotaping has been proven to yield results statistically equivalent to a 100% MOST analysis. The IE is reminded to set the sampling plan together with the statistician, process owner and management to work out an affordable plan. This is a cost-saving breakthrough that makes MOST analysis affordable and widens its scope to benefit more organizations, especially narrow-margin industries, in setting standards and pursuing continuous kaizen goals in realistic steps.

References
1. Douglas C. Montgomery, Introduction to Statistical Quality Control, 2nd Edition, John Wiley & Sons, 1991.
2. International MODAPTS Association, MODAPTS: Modular Arrangement of Predetermined Time Standards Manual, 4th Edition, 2009.
3. PSDC Academy, Maynard Operation Sequence Technique (MOST) Course Content, Revision 2008, prepared by Asia Pacific Research Centre (APRC).
4. Gavriel Salvendy, Handbook of Industrial Engineering: Technology and Operations Management, 3rd Edition, John Wiley & Sons.
5. Kjell B. Zandin, MOST Work Measurement Systems, 2nd Edition, Revised and Expanded, Marcel Dekker, Inc., 1992.
6. George Kanawaty, Introduction to Work Study, 4th (Revised) Edition, International Labour Office, Geneva, 1992.
7. Industrial Engineering Terminology, Revised Edition (a revision of ANSI Z94.0-1989), Engineering and Management Press, Institute of Industrial Engineers, Norcross, Georgia.
