Publié par : Published by: Publicación de la:

Faculté des sciences de l’administration Université Laval Québec (Québec) Canada G1K 7P4 Tél. Ph. Tel. : (418) 656-3644 Télec. Fax : (418) 656-7047

Édition électronique : Electronic publishing: Edición electrónica:

Aline Guimont Vice-décanat - Recherche et affaires académiques Faculté des sciences de l’administration

Disponible sur Internet : Available on Internet Disponible por Internet :

http://rd.fsa.ulaval.ca/ctr_doc/default.asp [email protected]

DOCUMENT DE TRAVAIL 2006-027 SIMULATION OF RANDOM TOOL LIVES IN METAL CUTTING ON A FLEXIBLE MACHINE

Martin NOËL Bernard F. LAMOND Manbir S. SODHI

Version originale : Original manuscript: Version original:

ISBN – 2-89524-279-8

Série électronique mise à jour : On-line publication updated : Seria electrónica, puesta al dia

12-2006

Simulation of Random Tool Lives in Metal Cutting on a Flexible Machine

MARTIN NOËL†*, BERNARD F. LAMOND‡ and MANBIR S. SODHI§

Abstract

This paper describes some numerical experiments related to a tool management model for a flexible machine equipped with a tool magazine, variable cutting speed, and sensors to monitor tool wear, when tool life due to flank wear is stochastic. A computer simulation was performed in which decisions about tool loading and cutting speed were based on a deterministic mathematical programming model where tool setup times are added to the total processing time whenever a tool is required but absent from the tool magazine. Two types of sensor systems are considered: offline sensors and online sensors. It is assumed that the sensor only indicates whether or not the tool is in good enough condition to continue processing. The simulation aims at answering six questions: (1) Which statistical distributions should be used to simulate the life of a cutting tool? (2) How effective is a deterministic model if tool lives are stochastic? (3) How effective are the two sensor systems? (4) If the machine must be stopped to inspect tool condition, what are the best parameters to use, i.e. inter-inspection times and threshold? (5) Is the use of a stochastic model still justified when tool life variability decreases (i.e. tool quality and reliability increase)? (6) Does adjusting cutting speed while processing a given part type help improve productivity?

Keywords: Stochastic tool life; variable cutting speed; simulation; dynamic speed adjustment; machine tool.

1  Introduction

There is always great interest in using a higher level of automation in a manufacturing environment. Not only does automation improve productivity, but it also improves the quality of parts produced while reducing production costs. To be successful, firms also require dependable machine tool diagnostic schemes. Among other things, not being able



* Corresponding author. Department of Labour, Economics and Management, Télé-université, Université du Québec à Montréal, Québec, Canada G1K 9H5. Tel.: 1 (418) 657-2747 ext. 5420, fax: 1 (418) 657-2094, Email: [email protected]
‡ Faculté des Sciences de l'Administration, Université Laval, Québec, Canada G1K 7P4. Tel.: 1 (418) 656-2131 ext. 5472, fax: 1 (418) 656-2624, Email: [email protected]
§ Industrial and Manufacturing Engineering, University of Rhode Island, Kingston, RI, USA 02881. Tel.: 1 (401) 874-5189, fax: 1 (401) 874-5540, Email: [email protected]


to monitor tool status is an obstacle to having an effective flexible manufacturing system, since unsuspected tool breakage can add significant delays to processing.

There are two main types of approaches used to measure tool wear: direct and indirect monitoring methods. Direct monitoring methods involve the observation of the cutting tool condition through optical measurement of wear, such as periodically observing changes in the tool profile, usually carried out with a microscope. Though this method provides very accurate wear information, the obvious disadvantage is that it is an offline method, since it requires the machining process to be stopped, creating delays, and is therefore limited in applicability (Colbaugh and Glass, 1995) unless inspection can be done very quickly. Some other offline methods, such as touch trigger probes, are also time consuming and cannot determine the exact time of premature cutting edge failure. Indirect monitoring methods, in contrast, estimate tool wear from the correlation between tool condition and process variables such as surface finish, temperature, vibrations, cutting force, power and acoustic emission (Kurada and Bradley, 1997). Since it can be used in real time, an indirect method can be of great interest for flexible manufacturing systems.

Cutting speed is one of the most important characteristics of tool management in metal cutting, and it may vary widely from one part type to another. In fact, cutting speeds can be as high as hundreds of feet per minute or as low as a few inches per minute, and cutting may be continuous for several hours or interrupted in fractions of a second (Kaspi and Shabtay, 2003). The objective is to select the cutting speed so as to minimize expected processing time, knowing that the conditions of operation can be quite variable and that this will have an important impact on the economic tool life. In that case, cutting speed must be selected to find the best tradeoff between cutting time and the number of tool setups needed. Indeed, a higher cutting speed implies shorter cutting time but more tools, since wear is faster at higher speeds, and vice versa for lower cutting speeds.

Traditional methods for predicting tool wear are based on the pioneering work by Taylor (1907), which gives a convenient relationship between tool life and cutting speed.

Knowing the tool and part material used and the desired cutting speed, we can estimate the nominal tool life, tl, as follows:

tl = tr (v / vr)^(−1/n)    (1)

where v is the cutting speed, vr is a reference cutting speed, tr is the tool life at the reference speed and n is Taylor's parameter, which is determined according to the work and tool material used, tool geometry and cutting fluid. This equation gives a good indication of the expected tool life for various cutting conditions. Even though there exist more precise tool life equations, we used this formulation, which is still widely used today, because of its simplicity. This is justified since cutting speed is known to be the most significant factor in determining tool life.

Many authors have studied randomness in tool lives (see, e.g., Wager and Barash, 1971, Ramalingam and Watson, 1977, Ramalingam, 1977, Colbaugh and Glass, 1995, Wiklund, 1998, Xie et al., 2005). However, when it comes to tool management (or, more specifically, assigning tools to machine magazines and selecting cutting speeds), the problem can get quite complicated if tool lives are treated stochastically. In this paper, we intend to show through simulations that treating tool life as deterministic instead of stochastic (i.e., as a random variable) can greatly mislead managers. Tool life has been modeled with different distributions, such as the gamma (Ramalingam, 1977, Wiklund, 1998), normal (Wager and Barash, 1971, Shabtay and Kaspi, 2003, Ramalingam, 1977, Rosetto and Zompi, 1981), lognormal (Wager and Barash, 1971, Ramalingam, 1977, Rosetto and Zompi, 1981, Wiklund, 1998), Weibull (Ramalingam and Watson, 1977, Liu et al., 2001, Wiklund, 1998) or exponential (Ramalingam and Watson, 1977). As a result, one goal of our simulation study is to compare the effect of various distributions of random tool life on the expected processing time.

There are two aspects neglected by deterministic assumptions, besides the choice of a tool life distribution, that are of interest in our study. First, there is a possible improvement in processing efficiency that can be achieved by adjusting cutting speed as new information becomes available. For example, if a tool breaks prematurely, adjusting the cutting speed to take this new information into consideration may improve productivity.
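To make equation (1) concrete, the short sketch below (ours, not from the original study) evaluates the nominal tool life at a few cutting speeds; the reference values vr = 1 m/s, tr = 200 s and n = 0.4 are the ones used later in the Figure 1 example.

```python
# Minimal sketch of Taylor's tool life relationship (equation (1)).
# Reference values are illustrative (they match the Figure 1 example).
def taylor_tool_life(v, v_ref=1.0, t_ref=200.0, n=0.4):
    """Nominal tool life t_l = t_r * (v / v_r) ** (-1 / n), in seconds."""
    return t_ref * (v / v_ref) ** (-1.0 / n)

if __name__ == "__main__":
    for v in (0.8, 1.0, 1.2):                      # cutting speeds in m/s
        print(f"v = {v:.1f} m/s -> tl = {taylor_tool_life(v):.1f} s")
```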

Second, we know from the experimental results of Wager and Barash (1971) that the coefficient of variation, defined as ξ = σ/μ, where μ is the expected tool life and σ is its standard deviation, is practically constant at all cutting speeds. In their work, variability was found to be fairly large, with ξ ≈ 0.3. However, recent experimental work by Wang et al. (2001) indicates that some carbide tools show less variable tool life nowadays. Therefore, in this paper, the impact of two different values of the coefficient of variation on scheduling performance will also be compared.

We used a computer simulation study to model cutting operations on a flexible machine with various parameters, tool types and part types. Simulation is commonly used to provide approximate answers quickly with a high level of accuracy (Ross, 2002). It can be used to easily try various configurations or to conduct a sensitivity analysis. In the present case, we used the simulation to answer six questions:

1. Many statistical distributions are suggested to simulate the life of a cutting tool, but do they really yield different results that can influence managers' planning, and if so, which should be chosen?
2. How effective is a deterministic model in a stochastic environment?
3. How effective are the two types of sensor systems?
4. If the machine must be stopped to inspect tool conditions, what are the best parameters to use, i.e. inter-inspection times and threshold?
5. Is the use of a stochastic model still justified when tool life variability decreases?
6. Does adjusting cutting speed while processing a given part type help improve productivity?

The first question involves comparing typical two-parameter statistical distributions for tool life: gamma, normal, lognormal and Weibull. Once tool lives are randomly generated from any of the previously mentioned distributions, we will be able to answer the second question. Since the first two questions are fairly simple to present, we will therefore concentrate, in Section 2, on detailing some of the particularities of the system that are necessary to answer Questions 3 to 6. In Section 3, the simulation model is presented. Results and analysis of various scenarios tested through the simulation are presented in Section 4. Concluding remarks are given in Section 5.


2  Specific modeling assumptions

In our model, it is assumed that machine tools have the capability to change cutting speed and to use an available tool on an as-needed basis. They are usually equipped with a tool magazine of finite capacity. When processing a part, the time required for switching between tools that have already been loaded in the magazine is negligible. However, if a needed tool is not available in the magazine, then the machine must be put offline and a new tool placed in the magazine or tool holder to complete processing. For a given tool type, this additional tool setup time is assumed to be a known constant.

Tool wear is assumed to be gradual and not the result of catastrophic breakage. Solutions to the simulated problems presented here are therefore a lower bound on a more realistic setting that would incorporate catastrophic failures involving extra time for possible repair of the part or machine. The selection of an optimal cutting speed will therefore have to take into consideration the tool wear process and the physical limitations of the machine. In other words, the cutting process takes place under normal conditions, i.e. between a minimum and a maximum speed. Tool sharing is not permitted between different part types. This is a legitimate assumption if, for example, the part material is very hard metal, or if the remaining tool life, which can be assumed marginal, is used for roughing cuts on other jobs. Finally, the setup time required for preloading the magazine is assumed to be constant and therefore does not play a role in time minimization. The problem of selecting the tool types that will be loaded into the magazine takes these machining assumptions into consideration.

In a deterministic environment (at least in theory), we can use Taylor's formula to exactly determine a tool's economic life according to the selected cutting speed. In that case, the focus turns toward the operating level. Scheduling then becomes strictly a problem of selecting the cutting speed and assigning tools to the magazine. We used the work of Lamond and Sodhi (1997), in which the optimal cutting speed can easily be determined deterministically, as a basis for comparison. Note that tool loading is done using marginal analysis on processing time with a stack sort algorithm (see Denardo (1982) and Lamond and Sodhi (1997) for details).


A manager usually has several machine tools, which can influence the way part types are allocated in the system. Sodhi et al. (2001) present an extended version of the deterministic model for the case of multiple machine tools. Here, however, we do not look at the part type allocation problem on identical machine tools; this could be done using the allocation heuristics developed in that paper and then simulating processing on each machine in order to determine the system's total processing time.

By contrast with the deterministic environment, when tool life is stochastic the attention turns toward monitoring the tool wear process. This can be done using sensors, whether online or offline. In the absence of sensors, monitoring tool wear is the responsibility of the operator. We will model the operator's judgment with a simple heuristic rule. These tool monitoring methods are presented next.

2.1  Monitoring tool wear

The simulation considers two types of tool wear monitoring systems that will influence replacement policies. In the first system, the machine tool is equipped with online sensors, generally using an indirect monitoring method. In the second system, the machine tool is equipped with offline sensors that usually detect tool wear directly. Those two types of sensors will be compared with the case where no sensors are used. In this last case, it is up to the operator to decide when to change a tool. In each case, the tool magazine is loaded before starting production as proposed by the deterministic algorithm presented in Lamond and Sodhi (1997).

Figure 1 illustrates how total processing time is obtained, whether the environment is deterministic or stochastic, according to the sensor system in place. The table on the left of the figure shows the various parameters, where d represents the distance to be traversed by the tool in order to complete the part and Θ is the setup time. These parameters are then used to derive optimal deterministic values (computed data), such as the number of tools needed, k*, the cutting speed, v*, and the corresponding tool life, tl. The lower cell shows an example of five possible random tool lives (the three tools, k*, justified through the deterministic model, plus two extra tool lives if needed). The chart to the right of Figure 1 shows how processing time is computed using those values according to the tool wear monitoring method used. This is then compared with the nominal tool life when the environment is assumed deterministic. The first rectangle – dark grey – in each row represents the amount of time the first tool was used, the following rectangle – light grey – the second tool, and so on. The number of rectangles up to the thick black line – representing the cutting time needed to complete the part type at speed v* – indicates the number of tools needed. Details for all three tool wear monitoring methods are presented in the next three subsections.

Parameters: d = 600 m; Θ = 100 sec; n = 0.4; vr = 1 m/s; tr = 200 sec
Computed data: k* = 3 tools; v* = 1 m/s; tl = 200 sec
Random tool lives: 138, 294 and 126 seconds
Extra tool lives: 132 and 306 seconds

Figure 1. Example of a simulated part type.

2.1.1  Machine tool without sensors

When work is done using a machine tool without sensors, we make the modeling assumption that the operator, who is often responsible for supervising several machines, uses a simple rule of thumb such as an age replacement policy (Barlow and Proschan, 1965). Specifically, we assume the tool is replaced when cutting time reaches tl, or when a tool failure occurs before that fixed time tl. We think this rule is a reasonable representation of actual industrial practice when machines are not equipped with sensors.

Recall that the processing time prescribed by the deterministic model is based on the assumption that each tool is used for an amount of time equal to tl. Therefore, as soon as a tool lasts less than tl, machining requires more tools than in the deterministic case. These extra tools need to be loaded individually by the operator, thus incurring more setup time. Consequently, the expected processing time will be larger than the deterministic processing time.

To illustrate our analysis, we use the parameters presented in Figure 1. If cutting time is 600 seconds (10 minutes) at the predetermined optimal deterministic cutting speed and three tool slots are available in the magazine, then each tool should last 200 seconds under the deterministic model. The simulation then generates three tool lives, from a given distribution, say gamma, with a mean of 200 seconds and a coefficient of variation of 0.3. From the first three randomly generated tool lives displayed in the table of Figure 1, we note that the second tool will only be used for 200 seconds, totaling, with all three tools, 464 seconds of cutting time. Since 136 seconds remain, a fourth tool life of 132 seconds is generated, which still does not suffice to finish the part. Indeed, 4 seconds remain and will be done with yet another extra tool. Note that the remaining economic tool life will probably be recycled and used for roughing other part types – represented by the hatched area in Figure 1. Hence, processing time under this policy is 600 seconds of cutting time plus two extra tool setups of 100 seconds each, totaling 800 seconds or 13 minutes 20 seconds. That is an increase of 33% over the processing time computed from the deterministic model.
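The age replacement rule and the setup accounting above are easy to simulate. The sketch below (our illustration, not the authors' code) reproduces the logic of the worked example: gamma tool lives with mean tl = 200 s and ξ = 0.3, each tool used for at most tl seconds, and every tool beyond the k* = 3 preloaded ones costing an extra setup of Θ = 100 s.

```python
import numpy as np

def no_sensor_processing_time(cut_time=600.0, tl=200.0, k_star=3, setup=100.0,
                              cv=0.3, rng=np.random.default_rng(1)):
    """One part type under the age replacement (no sensor) policy."""
    shape = 1.0 / cv ** 2                 # gamma parameters chosen so the mean is tl
    scale = tl / shape
    remaining, tools_used = cut_time, 0
    while remaining > 1e-9:
        life = rng.gamma(shape, scale)    # random tool life at the chosen speed
        used = min(life, tl, remaining)   # the operator retires the tool at tl at the latest
        remaining -= used
        tools_used += 1
    extra_setups = max(0, tools_used - k_star)
    return cut_time + setup * extra_setups

# Averaging over many replications approximates the expected stochastic processing time.
print(np.mean([no_sensor_processing_time() for _ in range(10_000)]))
```

Replaying the five tool lives of Figure 1 through this logic gives 138 + 200 + 126 + 132 + 4 = 600 seconds of cutting with two extra setups, i.e. the 800 seconds computed above.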

2.1.2  Machine tool with online sensors

When tool wear is continuously monitored and the sensors can give the exact time at which a tool is considered worn out (usually 0.3 mm of flank wear), this allows for full usage of tool lives. This is obviously the best-case scenario. In the example presented in Figure 1, machining with the three preloaded tools will only last 558 seconds instead of the 600 seconds expected originally. An additional tool is required, whose generated tool life is 132 seconds. Since only 42 seconds of processing are required, the actual machining will be done with one extra tool setup over the original deterministic strategy. Again, the hatched area in Figure 1 represents the economic tool life left after machining the part type. That is a 17% increase in processing time over the deterministic case, i.e. (700 − 600)/600 × 100%. Note that, in a similar simulation, the first few tool lives could last long enough to avoid a planned tool setup. In this case, the total simulated processing time would be less than what the deterministic case suggests. Unlike the case with no sensors, it is not obvious that the expected processing time will be larger than the deterministic processing time. Since we assume that the signal sent by the online sensors allows for full usage of the economic tool lives, we only need to simulate total tool life according to the statistical distribution, as was done previously.
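Under the same illustrative assumptions, the online-sensor policy changes a single line of the previous sketch: each tool is used for its full random life instead of being retired at tl (again a sketch, not the authors' code).

```python
import numpy as np

def online_sensor_processing_time(cut_time=600.0, tl=200.0, k_star=3, setup=100.0,
                                  cv=0.3, rng=np.random.default_rng(1)):
    """One part type when online sensors allow full usage of each tool's economic life."""
    shape, scale = 1.0 / cv ** 2, tl * cv ** 2
    remaining, tools_used = cut_time, 0
    while remaining > 1e-9:
        used = min(rng.gamma(shape, scale), remaining)   # run the tool until worn out
        remaining -= used
        tools_used += 1
    return cut_time + setup * max(0, tools_used - k_star)
```

On the Figure 1 lives this gives 138 + 294 + 126 + 42 = 600 seconds of cutting with one extra setup, i.e. the 700 seconds discussed above.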

2.1.3  Machine tool with offline sensors

Now, let the machine be equipped with an offline sensor system. We assume that inspections are quick and only give the following information: the tool is in good condition or it is worn out. The sensor can be a touch trigger probe or a fixed camera that can be used to quickly analyze tool condition. Tool wear monitoring can be done at fixed intervals of time or when deemed necessary, with more inspections near the end of the tool life, for example. The result of an inspection dictates whether the tool keeps processing or is replaced. For a tool to keep processing, it must respect two conditions. First, the tool is in good condition when inspected. Second, the probability of lasting until the next planned inspection is greater than a given threshold. Otherwise the tool must be replaced.

This decision rule is illustrated in Figure 1, where we use an offline sensor system with inspections at fixed intervals – every minute of operation. Inspections are represented by white lines. For the first tool, the second check, after 120 seconds, reveals a tool in good condition but with a very low probability of processing for another full minute, and so the tool is replaced. The second tool is inspected four times, and each time is in sufficient condition to continue. After its last inspection, the tool failed after 54 seconds, for a total of 294 seconds of cutting and a delay of 6 seconds, since the fifth inspection is set at 300 seconds of processing. After 120 seconds, the third tool was judged incapable of lasting another 60 seconds. Total cutting time for those three tools is 534 seconds. The extra tool used was sufficient to complete the job, leaving 54 seconds of possible cutting under the same conditions. Thus, if we suppose that it takes 5 seconds to perform a tool wear test, then total processing time, in this case, is 751 seconds, or 12 minutes 31 seconds, including 600 seconds of cutting time, one extra tool setup of 100 seconds, 9 inspections for 45 seconds and 6 seconds of delays. That is an increase of 25% over the processing time computed from the deterministic model.

We now give some details pertaining to the inspection schedule for the offline sensor case. When using an offline method to monitor tool wear, the inter-inspection times are set according to each part type's parameters. The time between inspections for a given part type can be equal throughout the processing of the part type, or it can be adjusted according to tool condition. Note that defining a unique inter-inspection time for all part types would not yield good results, since the mean useful life of the different tool types may vary quite a lot, from less than a minute to sometimes more than 15 minutes. In order to determine a good sequence of inter-inspection times, we need to find the best tradeoff between inspection times and downtimes. Inspection time is equal to the number of inspections done multiplied by the time required to make an inspection. Downtime represents the amount of time during which the tool is processing while worn out. In that case, a new tool will have to rework from the start of the downtime, which we assume to be known when processing resumes with the new tool.

When the tool life distribution has an increasing failure rate (IFR), we expect that a decreasing sequence of inter-inspection times would be better than using a fixed inter-inspection time. There is an exception when tool life follows an exponential distribution, which is memoryless; in that case, the inter-inspection time should be constant. We will simulate both constant and variable inter-inspection times when tool lives follow a gamma distribution (which is IFR when its shape parameter is greater than one).

Constant inter-inspection times

In the case where the inter-inspection time is fixed while processing a given part type, two questions immediately arise:

1. What should be the time interval between two inspections?
2. Under what conditions should a tool be replaced?

To answer these questions, we need to define some parameters of the model.


Let δ be the inter-inspection time in seconds and suppose δ > 0. Since we have no information on the exact condition of the tool when inspected, we resort to conditional probability to determine the probability that a tool that has lasted up to a certain time t will end its life within the next δ seconds. In other words, if X is a random variable representing tool life, then this probability is given by

P(X < t+δ | X > t) = P(t < X < t+δ) / P(X > t) = (F(t+δ) − F(t)) / (1 − F(t)),    (2)

where F(t) is the cumulative distribution function. In the case where tool lives follow a gamma distribution with shape parameter r and scale parameter λ, we note that since the coefficient of variation ξ is fixed and the expected tool life tl can be computed from the cutting speed using Taylor's formula, we can define the parameters of the gamma distribution as r = 1/ξ² and λ = r/tl. From Equation (2), we need to define an inter-inspection time δ and a threshold ε, such that our decision rule is to replace a tool either if it is worn out, or if it is in good condition but P(X < t+δ | X ≥ t) ≥ ε. To answer the two questions above, we need to determine δ and ε. Some choices are obviously better than others. On the one hand, it is easy to see that if the threshold is set too high and/or δ is too long, then chances are high that tools will fail while processing, requiring corrective maintenance, i.e. high downtime cost. On the other hand, if the threshold is set too low and/or δ is too short, there will probably be a high cost in preventive maintenance, i.e. high setup and inspection costs.

Let G(x) be the cumulative distribution function of the gamma with an integer shape parameter r, and let the inter-inspection time δ be a function of the expected tool life, i.e. δ = a·tl for some real number a > 0; then equation (2) will be independent of tl. Now, let y be a positive integer. At the time of the yth inspection, we have

H(ya) = G(ya·tl) = G(ya·r/λ) = 1 − exp(−ya·r) Σ_{j=0}^{r−1} (ya·r)^j / j!.

Let Py(a) be the conditional probability that a tool that is found in good condition at the yth inspection will end its life before the following inspection. Then, from equation (2), we have

Py(a) = [H((y+1)a) − H(ya)] / [1 − H(ya)].
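The quantities H(ya) and Py(a) are easy to evaluate numerically. The sketch below (ours; it assumes SciPy is available) works with survival functions to avoid cancellation deep in the right tail, and reproduces the numerical example given in the next paragraph.

```python
from scipy.stats import gamma

def survival(y, a, r):
    """1 - H(ya) = P(X > y*a*tl) for tool life X ~ Gamma(shape=r, rate=r/tl).
    With delta = a*tl, the result does not depend on tl."""
    return gamma.sf(y * a * r, a=r)

def P_y(y, a, r):
    """P_y(a): probability that a tool found good at inspection y fails before y + 1."""
    return 1.0 - survival(y + 1, a, r) / survival(y, a, r)

def smallest_y_over_threshold(a, r, eps, y_max=10_000):
    """Smallest y* with P_{y*}(a) >= eps, or None if eps exceeds the attainable maximum."""
    for y in range(1, y_max):
        if P_y(y, a, r) >= eps:
            return y
    return None

print(round(P_y(50, 0.1, 11), 4))              # the worked example below: about 0.594
print(smallest_y_over_threshold(0.11, 11, 0.5))
```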

Because the function Py(a) is nondecreasing in y, for a given threshold ε there exists an integer y* such that Py*(a) ≥ ε and Py*−1(a) < ε, provided ε is not too large. For example, suppose r = 11 and a = 0.1; then after 50 inspections, H(5) is almost equal to one but P50(0.1) = [H(5.1) − H(5)]/[1 − H(5)] = 0.5943, which is close to the maximum possible value for the threshold. If a = 0.2, then the maximum value that ε can take is 0.8462. From this equation, we can see that a is used to determine the inter-inspection time and that y is used to specify the threshold ε. Thus, a and y are the values needed to answer the two questions. We resort to simulation to find those optimal values, which are presented in Section 4.

Variable inter-inspection times

Since the exact tool status is never known, even when the tool is inspected, we cannot have an inspection schedule that takes into account tool condition. Instead, we will use information about the tool life distribution and vary the time between inspections in order to reduce inspection and delay costs. However, varying inter-inspection times in order to find an optimal schedule can become quite complicated. We present a predetermined inspection schedule, based only on the tool life distribution. This 'preset' inspection schedule is useful if tool inspections can only determine whether the tool is considered good or worn out. Due to the infinitely many combinations, we needed to make some assumptions and simplifications.

Assuming that the tool gradually deteriorates, the time between inspections will be nonincreasing as a given tool wears out. This assumption is realistic, since the probability density function of the system lifetime is log-concave and, in that case, Barlow et al. (1963) show that the optimal inspection intervals decrease with time. The normal, gamma (with shape parameter r ≥ 1) and Weibull (with shape parameter b ≥ 1) distributions are all examples of log-concave distributions. The inter-inspection time is assumed to be a function of tool life, i.e. the time between inspections i − 1 and i is defined as ai × tl, for some values of ai ∈ {0.1, 0.2, 0.3, …}. Recall that tl represents nominal tool life. Therefore, a1 ≥ a2 ≥ … ≥ ai ≥ ai+1 ≥ … Suggested values for ai and the threshold ε were obtained through simulation and are presented in Section 4.

From Section 2.1, we can expect that improving tool monitoring can be a great asset for higher throughput. Conversely, increasing tool quality can also improve tool usage and the predictability of tool breakage, hence improving scheduling.

2.2  Influence of tool quality

The quality of tools plays a major role in the productivity of a flexible manufacturing system (FMS). According to Sandvik Coromant (2000), a 20% increase in metal removal reduces the total cost per component by 15%. To attain such performance, high quality cutting tools are essential. Better quality translates into better reliability in the tool's useful life, which in turn diminishes the variance. The coefficient of variation is known to remain almost constant regardless of the part and tool material used; Wager and Barash (1971) suggested a coefficient of 0.3. However, tool suppliers are continually working on improving the quality of tools, hoping to make them more durable and reliable, which would result in longer mean tool life and/or reduced variance. This implies lower values of the coefficient of variation of tool lives (as the coefficient of variation nears zero, the model becomes more deterministic, hence more predictable). More recently, Wang et al. (2001) have studied the reliability of carbide cutting tools. They showed that the coefficient of variation is approximately constant at 0.1. To test the effect of reduced tool life variation, additional simulation runs have been conducted with that value of the coefficient of variation.

Even with good machines and tools, results can be very disappointing if no effort is made to improve process planning. Due to the stochastic nature of machining, continuous quality improvement can noticeably reduce processing time and unexpected scheduling problems. This is addressed in the following section.

2.3  Online adjustment strategies

Sections 2.1 and 2.2 looked at ways to increase productivity by either improving the machine tool's monitoring system or by upgrading tool quality. However, altering the schedule and adapting it to the new status and production requirements, called dynamic scheduling, can help increase output without investing in equipment. Dynamic scheduling can be done by either regeneration or net change: 'Regeneration aims to produce a new schedule covering all unstarted operations while net change only produces a schedule for part of operations' (Song et al., 2003). In this section, we consider improving production planning for a given part type by reacting to early or late tool breakage. The interest turns to possibly reducing cutting time, but mostly to avoiding unnecessary and costly extra tool setups.

We opted for a dynamic strategy that revises processing speed each time a tool change takes place. When this occurs, we use the deterministic algorithm presented by Lamond and Sodhi (1997) to re-optimize the cutting process. If we let t̂i represent a random tool life and Ti be the number of tools loaded into the magazine for the processing of a given part type i, then the following updates must be made each time a tool is consumed and the part type is not yet finished, i.e. di > 0:

di = (di/vi* − t̂i) × vi*;    Ti = max(0, Ti − 1).

Here, d/v represents cutting time at cutting speed v. If tool life is generated from a gamma distribution while using the optimal deterministic cutting speed and a coefficient of variation of 0.3, there is only a 46% chance that the last planned tool will actually last its expected tli seconds†. So, more than half of the time, the last planned tool breaks while leaving little work to process, yielding a costly tool setup. Again with the new tool, re-optimizing using the deterministic model will suggest a faster cutting speed with only a 46% probability of completing the task. Without the physical limits of the system (maximum cutting speed vu), we can expect that the speed will continuously increase and that 2.17 extra setups on average will occur as soon as there is only one tool planned‡. Depending on machining capabilities, there can be occasions when many extra tool setups will be needed, which reduces the effectiveness of the online adjustment process.



† Since cutting speed is always adjusted, tool life follows a Gamma(11.1, 11.1/tl) and P(X ≥ tl) = .46 whenever re-optimizing yields k* = 1. If we use the standardized form of the gamma, we have a Gamma(11.1, 1) and P(X ≥ tl(11.1/tl)). In other words, the probability of completing the part type depends only on r.
‡ From a geometric distribution with parameter p = 0.46.


This would obviously yield a much higher total processing time. However, we can use a lower cutting speed than prescribed by Taylor's formula for the last planned tools. This is a realistic assumption, since cutting speeds used in industry are about 20% to 50% lower than the recommended economic values (Wiklund, 1998).

Figure 2 illustrates an example of an online adjustment as considered in our experiments. Fourteen tools are used to process a given part type. For the first tool, the cutting speed is set at the optimal deterministic cutting speed with an expected tool life of tl seconds. Afterward, the cutting velocity is computed according to the work remaining on the part type, again using the deterministic model. We can notice that the third and eleventh tools have a speed set below the original optimal cutting speed; in that case, a longer tool life is expected, which could help save a tool setup. Also note that if the random tool life is close to what is expected, the cutting speed will not vary (see tools 6 and 7). Finally, when very little work remains, the cutting speed tends to increase. In the case of tool 14, the speed is increased all the way up to the machine limit, which was arbitrarily set to 1.2 m/s. It may not be economically optimal to burn a tool at very high speed; however, it helps reach our goal of minimizing total processing time.

In general, variation in tool life will have a greater impact when few tools remain, with a tendency toward higher cutting speed. For example, a variation of 10 seconds when 1200 seconds of work remain will have little impact on total processing time, since other tools can adjust for time lost or time saved. However, when only 100 seconds remain, 10 seconds can mean setting up a new tool. To avoid those costly extra tool setups, the cutting speed will be reduced whenever k* = 1. The next section details the simulation model.
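A compact way to picture the dynamic strategy is the loop below. It is only a sketch under our own simplifying assumptions: the function that recomputes the speed is a stand-in for the Lamond and Sodhi (1997) re-optimization (here, the speed at which the remaining work consumes exactly the tools still planned), the Taylor data are those of Figure 1, the machine limit is the 1.2 m/s of Figure 2, and the last planned tool is slowed by 20%, within the 13–20% range reported in Section 4.

```python
import numpy as np

V_REF, T_REF, N = 1.0, 200.0, 0.4            # illustrative Taylor data (Figure 1)
V_MIN, V_MAX = 0.1, 1.2                      # machine speed limits (1.2 m/s as in Figure 2)
LAST_TOOL_FACTOR = 0.8                       # slow the last planned tool by 20%

def tool_life(v):
    return T_REF * (v / V_REF) ** (-1.0 / N)

def reoptimize_speed(d, tools_left):
    """Stand-in for the deterministic re-optimization (NOT the Lamond-Sodhi algorithm):
    pick the speed at which the remaining work d uses exactly `tools_left` full tool lives."""
    k = max(tools_left, 1)
    v = (k * T_REF * V_REF ** (1.0 / N) / d) ** (N / (1.0 - N))
    if tools_left <= 1:
        v *= LAST_TOOL_FACTOR                # reduce speed to avoid a costly extra setup
    return min(max(v, V_MIN), V_MAX)

def simulate_part(d=600.0, k_star=3, setup=100.0, cv=0.3, rng=np.random.default_rng(0)):
    """Cutting plus extra setup time with the speed revised after every tool change."""
    shape = 1.0 / cv ** 2
    cut_time, tools_used, tools_left = 0.0, 0, k_star
    while d > 1e-9:
        v = reoptimize_speed(d, tools_left)
        life = rng.gamma(shape, tool_life(v) / shape)   # random life at the current speed
        used = min(life, d / v)                         # stop at wear-out or part completion
        cut_time += used
        d = (d / v - used) * v                          # the update rule given in the text
        tools_used += 1
        tools_left = max(0, tools_left - 1)
    return cut_time + setup * max(0, tools_used - k_star)
```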


Figure 2. Illustration of online adjustments.

3  Simulation Model

As discussed earlier, the objective of the simulation is to answer six questions concerning the impact of a deterministic model in a stochastic environment (including various values of the coefficient of variation); the importance of the tool life distribution; the influence of various tool wear monitoring systems; the optimal schedule when using offline sensors; and the possible improvement gained by dynamically updating parameters.

Recall that a flexible machine with a tool magazine of capacity χ is used to process a number I of part types. We assume each part requires its own tool type, which cannot be used to process a different part type. Up to χ tools (including duplicates, if any) can be preloaded in the tool magazine; their setup time is considered negligible when they are used for cutting. We use a tool loading policy that is optimal for deterministic tool life, as in Lamond and Sodhi (1997). The computed processing time required to produce a set of part types with this tool configuration is noted. The simulation then uses the same parameters plus a coefficient of variation to generate random tool lives, which are used to calculate the stochastic processing time, including cutting, tool setup and, in the offline sensor case, inspection and delay times. The difference in percentage between deterministic and simulated processing times is subsequently reported and plotted.

The first step in our simulation is to determine the size of the problem. The inputs necessary for computing are the number of parts and the tool magazine capacity. Three batch sizes are used (number of parts I = 5, 10 and 20), along with six ratios of tool magazine capacity to batch size (χ/I), starting from a very tight ratio (no tool magazine, ratio = 0) to a much looser one (five times as many tool slots as part types, ratio = 5). The simulation model is presented in Figure 3. It is a simplified version showing the initialization process before determining the optimal deterministic tool loading and selecting the type of sensor used.

The data are randomly generated from uniform distributions with limits that roughly correspond to the physically permissible range for each of the parameters of interest (Sodhi et al., 2001). The parameters concerning the task and tool characteristics required for simulating the processing of a given part type were generated as shown in Table 1. For the no-sensor and online sensor cases, we simply need to simulate the economic tool lives used and count the number of tools needed to process a set of part types. For the offline sensor case, we also need to count, for each part type, the number of inspections and to compute the delay time, or downtime. Total processing time is then computed and compared with the deterministic case. Note that for the particular range of parameter values we used, average tool lives are rather small, making any extra tool setup quite significant in total processing time. This is a legitimate assumption if part types are made of very hard metal, like stainless steel or titanium, in which case high speed cutting will deteriorate the tool quickly. Once the parameters are set for a given run, we can compute the total processing time for the deterministic case, based on Lamond and Sodhi's work (1997), which selects, for each part type, the optimal cutting speed and the optimal number of tools to use and to preload into the magazine.

Parameter             Distribution for randomness
Setup time            Θi ~ U[100 sec, 500 sec]
Reference speed       vr = 1 m/s
Amount of work        di ~ U[400 m, 600 m]
Taylor's parameter    ni ~ U[0.1, 0.7]
Tool life constant    Ci ~ U[2, 7], where Ci = vr·tr,i^ni

Table 1. Distributions used for generating test problems.
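For completeness, here is a sketch (ours) of how one random test problem can be drawn from the distributions in Table 1; the reference tool life tr,i is recovered from the tool life constant through Ci = vr·tr,i^ni.

```python
import numpy as np

def random_test_problem(rng):
    """Draw the parameters of one part type from the distributions of Table 1."""
    theta = rng.uniform(100.0, 500.0)     # setup time Theta_i (s)
    v_r = 1.0                             # reference cutting speed (m/s)
    d = rng.uniform(400.0, 600.0)         # amount of work d_i (m)
    n = rng.uniform(0.1, 0.7)             # Taylor's parameter n_i
    C = rng.uniform(2.0, 7.0)             # tool life constant C_i = v_r * t_r**n_i
    t_r = (C / v_r) ** (1.0 / n)          # reference tool life implied by C_i
    return {"theta": theta, "v_r": v_r, "d": d, "n": n, "C": C, "t_r": t_r}

rng = np.random.default_rng(2006)
batch = [random_test_problem(rng) for _ in range(5)]   # e.g. a batch of I = 5 part types
```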

[Figure 3 (flowchart): after initialization – selecting the batch size, χ, ξ and the tool and part parameters, and determining the optimal deterministic processing time – tool lives are simulated according to the type of sensor in place (no sensor, online sensors, or offline sensors with a count of inspections and any downtime from corrective changes), looping until each part and then the whole batch is finished, at which point the percentage increase is computed.]

Figure 3. Diagram for processing part types according to the type of sensor used.

Now, in the simulation, we are interested in the case where the tool life is stochastic rather than deterministic. Since, in a stochastic environment, the length of time a tool will last is not known with certainty, tool lives are randomly generated from a given distribution with mean tl and coefficient of variation ξ. The tool life in excess of tl will be accounted for if sensors permit. As long as the part type is not completely processed, new tools are used, which sometimes implies setting up extra tools. For a static system, the same speed and mean tool life are used throughout the production of the given part type.

For dynamic online adjustment, the new optimal speed v* is computed, along with the corresponding mean tool life tl, every time a new tool is used. When tl gets very small, the optimal speed will increase up to vu. Those data are then used to generate a new random tool life duration, and the simulation program loops until the part type is finished. The next section reports results obtained from the simulation for every strategy used: standard machine tool, machine equipped with online or offline tool wear sensors, online adjustment, and different coefficients of variation.

4  Results and analysis

To get significant results, an experiment comprises 10,000 part types simulated for each of the following cases: part types are separated into batches of 5, 10 or 20, and the magazine size is 0, 1, 2, 3, 4 or 5 times the batch size. The processing time of each part type is computed first from the deterministic model and is then compared with the experimental results. Since the simulated processes are random, each time the program is run we get slightly different results; the higher the number of part types simulated in an experiment, the smaller the variation in the results. To see this, let x̄ be the sample mean and s the sample standard deviation of the observed values; then the interval is a function of the batch size, noted I (there being 10,000/I independent batches), and is given by

x̄ ± z0.025 s / √(10,000 / I).

This gives an approximate 95% confidence interval estimate of the expected increase between the deterministic model and the simulation of that model (see Ross (2002) for details). Table 2 presents results from some experiments according to magazine size. Notice that the error margin is between 0.42% and 1.46%, 19 times out of 20. In fact, the interval is, on average, less than 7% of the sample mean, which is quite accurate.
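The half-width of this interval is easy to verify against Table 2 below (a sketch; the two standard deviations used are the first and last columns reported for batch size I = 5):

```python
from math import sqrt

def ci_half_width(s, batch_size, n_parts=10_000, z=1.96):
    """Approximate 95% half-width: z * s / sqrt(n_parts / batch_size)."""
    return z * s / sqrt(n_parts / batch_size)

print(round(ci_half_width(21.46, 5), 2))   # about 0.94, as in the first column of Table 2
print(round(ci_half_width(74.39, 5), 2))   # about 3.26, as in the last column of Table 2
```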


χ                   0        5        10       15       20       25
x̄                   16.97%   31.13%   37.17%   40.98%   46.13%   48.75%
s                   21.46%   55.48%   59.23%   61.76%   68.30%   74.39%
1.96·s·√5/100       0.94%    2.43%    2.60%    2.71%    2.99%    3.26%

Table 2. Interval estimates according to magazine size when batch size = 5.

An experiment also requires one of the three tool wear monitoring systems (no sensors, online sensors or offline sensors), a statistical tool life distribution (gamma, normal, lognormal or Weibull), and a constant coefficient of variation (0.3 or 0.1). If selecting one or another of those parameters (for example, selecting ξ = 0.3 or 0.1) has limited impact on the results, only one is chosen to avoid redundancy in the subsequent analysis. The results are presented according to tool replacement policies: online sensors, then no sensors and, finally, offline sensors. Afterwards, results and analysis of an online cutting speed adjustment strategy are presented.

4.1  Online sensors case

Table 3 shows the average percentage increase† in processing time between the deterministic optimal processing time and what is expected from that optimal setting obtained when a machine tool is equipped with online sensors and the coefficient of variation is 0.3. Table 4 shows the same settings but for a coefficient of variation of 0.1. Those two tables allow us to compare all four statistical distributions. First, note that the lognormal yields generally higher percentage increases, followed by the gamma; the Weibull shows the smallest differences. For a given coefficient of variation, batch size and tool magazine size, the highest difference between one distribution and another is six percent, which is less than the error margin previously obtained. In general, the difference is about three percent. Since the choice of the tool life distribution does not seem to be an important factor for all practical purposes, we arbitrarily selected the gamma distribution as the tool life distribution for the analysis of other cases.



† If Pstoch and Pdet are, respectively, the processing times estimated at v* from the simulation and from the deterministic model, then the percentage increase is computed as (Pstoch − Pdet)/Pdet.


Results from the gamma experiments of Table 3 and Table 4 are depicted graphically in Figure 4. By examining this last figure, one can quickly realize that the difference between the deterministic model and the simulated runs is quite noticeable. In fact, it varies between 16 and 50% in increased production time when coefficient of variation is 0.3, and 14 to 44% when ξ = 0.1. Surprisingly, a lower coefficient of variation does not yield a much lower percentage gap, especially when magazine size is small. It becomes a little more noticeable when many tool slots are available for a part type, a decrease ranging between 4.6 and 8.6% for the largest tool magazine tested of each batch size. This may be due to the high penalty cost for extra tool setup. On a different note, the variation between having 5, 10 or 20 part types is not very important – less than two percent, generally not even one percent. In fact, from the tables presented here, the average number of tool slots available per part type seems to be the only major factor that really affects how far realistic cutting scenarios are from the deterministic plan.

5 part types
χ          0        5        10       15       20       25
Gamma      16.97%   31.13%   37.17%   40.98%   46.13%   48.75%
Normal     16.11%   29.86%   34.54%   39.91%   43.36%   48.58%
Lognormal  17.92%   31.98%   38.73%   41.41%   46.44%   50.33%
Weibull    16.03%   29.35%   34.92%   38.44%   43.93%   48.97%

10 part types
χ          0        10       20       30       40       50
Gamma      16.61%   31.39%   36.49%   40.41%   44.45%   47.65%
Normal     16.26%   29.04%   34.58%   38.85%   42.32%   46.49%
Lognormal  16.94%   31.90%   36.80%   41.40%   45.73%   48.64%
Weibull    15.87%   29.08%   33.73%   38.92%   42.53%   46.25%

20 part types
χ          0        20       40       60       80       100
Gamma      17.11%   30.60%   36.65%   38.92%   43.95%   47.23%
Normal     15.88%   29.56%   33.89%   38.26%   42.78%   47.61%
Lognormal  17.26%   31.59%   37.26%   41.18%   44.25%   48.63%
Weibull    15.85%   29.48%   34.60%   37.53%   42.03%   47.71%

Table 3. Percentage increase in processing time over the deterministic case when ξ = 0.3 (online sensors).

Tool magazine size is the variable showing the greatest impact on processing times. One possible explanation for this disparity between deterministic and simulated processing times is that the deterministic case tends to require fewer manual tool setups when a large tool magazine is available.

In that case, we observe a lower deterministic total processing time, so the impact of extra tool setups in the simulation shows up as bigger percentage increases. To illustrate this, let us use the data in Figure 1. On the one hand, without a tool magazine, the total deterministic processing time is 900 seconds (600 sec + 3 × 100 sec); if the simulation requires one extra tool setup, the increase is 11%. On the other hand, if χ = 3, then the deterministic processing time is only the cutting time, i.e. 600 sec, and an extra tool setup during simulation yields an increase of 17%. Another reason that might explain the importance of the tool magazine is that more tools are used when magazine sizes are bigger, in which case greater variability in cutting time can be observed.

5 part types
χ          0        5        10       15       20       25
Gamma      15.89%   29.32%   34.66%   37.81%   40.74%   43.68%
Normal     15.76%   28.86%   33.70%   36.94%   38.34%   42.88%
Lognormal  16.28%   30.74%   34.59%   37.57%   40.61%   43.62%
Weibull    14.24%   26.70%   30.88%   34.14%   37.15%   40.49%

10 part types
χ          0        10       20       30       40       50
Gamma      15.60%   29.14%   33.78%   37.57%   39.00%   41.46%
Normal     15.49%   28.07%   32.51%   36.71%   39.35%   41.88%
Lognormal  16.46%   29.12%   34.78%   38.76%   40.12%   42.53%
Weibull    14.35%   25.26%   30.26%   32.67%   37.10%   38.23%

20 part types
χ          0        20       40       60       80       100
Gamma      16.37%   29.63%   35.06%   36.20%   40.03%   41.84%
Normal     15.63%   29.09%   33.25%   35.00%   38.09%   42.12%
Lognormal  16.28%   29.06%   34.88%   35.56%   40.54%   42.72%
Weibull    13.95%   25.99%   31.10%   31.94%   36.37%   39.11%

Table 4. Percentage increase in processing time over the deterministic case when ξ = 0.1 (online sensors).

We can also note that larger magazine sizes yield bigger percentage increases. One possible explanation is that, since more tools are most probably used to machine a given part type, the total processing time is shorter, and the influence of any unplanned manual tool setup is greater than it is on the higher total processing times observed for smaller magazine sizes. We can also note that as more tools are used, we observe greater variability in the sum of tool lives, which can increase the number of tool setups.

Figure 4. Influence of tool magazine capacity for a given coefficient of variation and number of part types.

4.2  No-sensor case

We can expect worse results when no equipment helps detect tool wear. Table 5 confirms this: the gap between the model and the simulation rises to over 17% in the case of a machine without a magazine. For the largest magazine size tested, the divergence reaches 68% with ξ = 0.1 and more than 85% with ξ = 0.3. It is hard to stay on schedule with such delays. These figures highlight the importance of having a sensor system. In fact, they demonstrate that a machine equipped with sensors can improve productivity by as much as 33% over a standard machine tool‡. However, when the tool magazine is not used, the improvement is not as substantial, down to around 2% higher productivity.



‡ If Pols and Pns are, respectively, the processing times obtained from the simulation of a machine equipped with online sensors and of a traditional machine, then the percentage increase is computed as (Pns − Pols)/Pols.


Investing in a sensor system seems to become important when the machines are equipped with a tool magazine. The time saved using a machine with a sensor system is very significant when compared to a normal machine tool. If the machine operates for a minimum of 2,000 hours per year (50 weeks at 40 hours a week), the firm could save between 40 and 660 hours using online sensors. If machines work continuously, the time saved can reach up to 2,890 hours (365 days × 24 h/day × 33%). Investment in such a system may easily be justified through a cost analysis.

The online sensor policy shows the best results since tools are fully used, as opposed to the other policies, where some potentially useful tool life is discarded. We can expect results from the offline sensor policy to lie somewhere in between the first two. Before presenting the experiment for a machine equipped with offline sensors, there are two important observations that allow us to simplify the simulation with regard to the parameters used. First, in both previous analyses, the coefficient of variation of 0.1 yielded significant differences between the deterministic model and the simulated processing times. Therefore, even with better tool quality, the need for a stochastic model is justified. For that reason, the analysis of this scenario will concentrate on the suggested coefficient of variation of 0.3. Second, note that the computed gap between theoretical and simulated results shows little variation with respect to the number of part types selected to be processed, say 5, 10 or 20. In the first two policies presented, the worst case is a large tool-to-part-type ratio on a traditional machine, in which case the gap reaches 3%. Consequently, we will analyze the offline sensor and dynamic adjustment scenarios based arbitrarily on sets of 5 part types.


Coefficient of variation = 0.3
Tool slots per part type   0        1        2        3        4        5
5 part types               19.00%   34.83%   45.61%   56.66%   71.64%   85.39%
10 part types              18.65%   34.90%   43.65%   54.03%   67.21%   80.75%
20 part types              19.24%   34.16%   43.69%   51.86%   64.66%   79.34%

Coefficient of variation = 0.1
5 part types               17.58%   32.62%   41.97%   50.76%   59.16%   68.12%
10 part types              17.40%   32.25%   39.92%   49.05%   55.81%   64.08%
20 part types              17.99%   32.60%   41.20%   46.55%   55.32%   63.11%

Table 5. Percentage increase over the deterministic case when no sensor is used.

4.3  Offline sensor case

The final tool replacement policy simulated concerns an offline sensor system. If we suppose wear is constant after a given 'break-in' phase, then one only needs to test the condition of the tool at the end of that phase to determine its useful life exactly (see Wiklund (1998) for an empirical test). The resulting processing time would be the same as for the online policy, except for the inspection time, i.e. the number of tools used multiplied by a check-up time that we assume constant. However, prediction of tool wear can be quite complicated due to the complexity of the machining system: the cutting process often takes place under extreme conditions, with temperatures at the cutting edge that can exceed 1800°F and pressures greater than 2000 psi (Xie et al., 2005). More generally, we need to set an inspection schedule that finds the best tradeoff between corrective maintenance (high downtime cost) and preventive maintenance (high inspection and extra tool setup costs). Hence we need to define the inter-inspection times and the threshold that minimize overall processing time. Total processing time is computed as the sum of cutting, inspection, delay and setup times.

Note that if inspection time is assumed to be negligible, then tool condition will be monitored as often as possible, i.e. inter-inspection intervals will tend towards zero, and we effectively have an online sensor system. A constant time for each inspection was determined in Jeon and Kim's paper (1988); they used equipment capable of detecting tool wear within 1.7 seconds of required processing time. We can expect that sensors are now more efficient at rapidly detecting tool condition; however, in the absence of more recent studies, we adopt this value.

In the case where the inter-inspection time is constant, we simply simulated various threshold and inter-inspection time values. Values for a, which defines the inter-inspection times, range between 0.05 and 0.25, and thresholds were tested for values between 0.1 and 0.8. Figure 5 shows an example of results obtained from the tested values when χ = 0. Note that the grey area represents values for which ε does not exist for the given a. The closer we are to that area, the more inspections are scheduled before reaching the threshold. We clearly see that, in this case, the optimum is reached at ε = 0.5 and a = 0.11. However, there is a wide diagonal region where we get a fairly low value as well. This region has a maximum number of inspections that is fairly high, which means that the tool is often used to its full economic life.

Optimal scenarios for all tool magazine sizes are presented in Table 6. The last two columns present, respectively, the number of inspections needed to reach the threshold and the probability that the tool life will last beyond the last inspection. No matter the magazine size, the optimal solution is found when the threshold yields a fairly low probability for the tool to last beyond that point. This probability decreases as the magazine size gets larger. With the high values found for the threshold suggesting that tools be used up to their critical wear limit, it is not surprising that the inter-inspection time is short, between 0.11 and 0.21 of the expected tool life. This might be due to the very low inspection cost. Notice that even with the penalties (delay and inspection times), offline sensors still yield better results than the no-sensor case when larger tool magazines are used. Without the penalties, the results are fairly close to the online sensor case, which is to be expected, since tools are used close to their total economic life. An analysis of the penalty cost when catastrophic failure occurs could also discriminate between the sensor types used. Finally, notice that larger tool magazines also seem to imply larger inspection and/or delay costs. Those results could be improved by reducing the number of inspections, especially when a tool is new. This can be done with variable inspection times.
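To make the grid search concrete, the sketch below (our illustration, with the 5-second inspection time and ξ = 0.3 used above) simulates one part type under the constant inter-inspection policy for a given pair (a, ε), accumulating cutting, inspection, delay and extra setup times; sweeping a and ε and averaging over many parts produces the kind of map shown in Figure 5.

```python
import numpy as np
from scipy.stats import gamma

INSPECT_TIME, CV = 5.0, 0.3                       # 5 s per check, xi = 0.3 (assumptions)

def p_fail_next(y, a, r):
    """P_y(a): a tool found good at inspection y fails before inspection y + 1."""
    return 1.0 - gamma.sf((y + 1) * a * r, a=r) / gamma.sf(y * a * r, a=r)

def offline_processing_time(cut_time, tl, k_star, setup, a, eps,
                            rng=np.random.default_rng(0)):
    """One part type under the constant inter-inspection policy with parameters (a, eps)."""
    r, delta = 1.0 / CV ** 2, a * tl
    remaining, tools, total = cut_time, 0, 0.0
    while remaining > 1e-9:
        life = rng.gamma(r, tl / r)               # this tool's (unobserved) life
        used, y, alive = 0.0, 0, True
        while alive and remaining - used > 1e-9:
            step = min(delta, remaining - used)   # cut until the next inspection or part end
            if used + step >= life:               # wear-out before the check: downtime
                total += INSPECT_TIME + (used + step - life)
                used, alive = life, False
            else:
                used += step
                y += 1
                if remaining - used > 1e-9:       # part unfinished: quick offline check
                    total += INSPECT_TIME
                    if p_fail_next(y, a, r) >= eps:
                        alive = False             # preventive replacement
        remaining -= used
        total += used
        tools += 1
    return total + setup * max(0, tools - k_star)

# e.g. the settings reported as optimal for chi = 0: a = 0.11, eps = 0.5
print(np.mean([offline_processing_time(600.0, 200.0, 3, 100.0, 0.11, 0.5)
               for _ in range(2_000)]))
```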


Figure 5. Iso-level curves of the percentage increase over the deterministic case when offline sensors are used (χ = 0). InterCutting + Max number Cutting + inspection Threshold of inspect. χ Setup + delay P(X > aYtl) Setup + inspection Y ε a = δ/ tl 0 20.96% 20.17% .11 .5 19 2×10-3 5 36.83% 34.83% .15 .6 14 2×10-3 10 43.98% 40.75% .11 .5 19 2×10-3 15 50.31% 46.12% .13 .6 19 1×10-4 20 56.09% 50.55% .17 .7 15 8×10-5 25 63.09% 55.87% .21 .8 15 9×10-7 Table 6. Percentage increase over the deterministic case when offline sensors are used with constant inter-inspection times. The variable inspection schedule uses predetermined times for all possible inspections and it is fixed as a fraction of the expected tool life. This ‘preset’ schedule is very simple and does not require perfect knowledge of the tool status. Even when compared with the fixed schedule (see the second column in Table 7), the disparity between fixed and variable schedules is no more than 3.73%. This could be due to the fact that delay and inspection costs are fairly low compared to cutting and setup costs. As expected, optimal schedules for the offline sensor system generally fall in between the online sensor and the no-sensor cases. The only exception is when there is no 27

The offline sensor system becomes advantageous as magazine size increases, approaching 25 percentage points below the no-sensor case when there are 25 tool slots for processing 5 part types. Similarly, the gap between the online and offline sensor systems also widens, reaching just over 12 percentage points for the biggest tool magazine tested. Recall that the variables of the inspection schedule are all set to values rounded to the first decimal; refining those values may yield superior results.

χ     Offline sensors                         Online      No
      Variable schedule   Fixed schedule      sensors     sensor
0     19.71               20.96               16.97       19.00
5     34.39               36.83               31.13       34.83
10    41.76               43.98               37.17       45.61
15    46.58               50.31               40.98       56.66
20    53.76               56.09               46.13       71.64
25    60.84               63.09               48.75       85.39

Table 7. Percentage increase over the deterministic case (lot of 5 part types).

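The comparisons quoted above can be read directly off Table 7; the short snippet below (column names are ours) simply recomputes the gaps in percentage points.

```python
# Recompute the gaps quoted in the text from the Table 7 figures.
chi         = [0, 5, 10, 15, 20, 25]
offline_var = [19.71, 34.39, 41.76, 46.58, 53.76, 60.84]   # offline, variable schedule
offline_fix = [20.96, 36.83, 43.98, 50.31, 56.09, 63.09]   # offline, fixed schedule
online      = [16.97, 31.13, 37.17, 40.98, 46.13, 48.75]
no_sensor   = [19.00, 34.83, 45.61, 56.66, 71.64, 85.39]

for c, v, f, o, ns in zip(chi, offline_var, offline_fix, online, no_sensor):
    print(f"chi = {c:2d}: fixed - variable = {f - v:5.2f}, "
          f"no sensor - offline = {ns - v:6.2f}, offline - online = {v - o:5.2f}")
```

For χ = 0 the offline (variable) schedule sits 0.71 points above the no-sensor case, the largest fixed-variable gap is 3.73 points at χ = 15, and at χ = 25 the offline system lies close to 25 points below the no-sensor case and about 12.1 points above the online one.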

Table 8 presents, for each tool magazine size, the preset schedule that gave the lowest processing time in the simulation. As can be seen, a bigger tool magazine allows more processing time before the first inspection, which results in a lower number of inspections. Also note that, for the schedules presented, the threshold is set to a constant maximum value of 0.463. The probability of a tool lasting beyond the last scheduled inspection is 0.0035.
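As a rough check (a sketch, assuming the gamma tool-life model with coefficient of variation 0.3 stated in the conclusion, with tool life measured in units of its mean): the preset times of Table 8 below sum to 2.0 expected tool lives for every magazine size, so the probability of outliving the last scheduled inspection is the gamma tail at twice the mean.

```python
# Tail probability of a gamma tool life beyond the end of the preset schedule.
from scipy.stats import gamma

cv = 0.3
shape = 1.0 / cv**2          # k = 1 / cv**2
scale = 1.0 / shape          # mean = k * scale = 1 (one expected tool life)

# Cumulative preset inspection times sum to 2.0 for every magazine size in Table 8.
print(gamma.sf(2.0, shape, scale=scale))         # about 0.0035, consistent with the text

# The same calculation matches the last column of Table 6: for example, 19
# inspections every 0.11 of the expected tool life end at 2.09.
print(gamma.sf(19 * 0.11, shape, scale=scale))   # about 2e-3
```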


Inspection     1    2    3    4    5    6    7    8    9    10   11   12   13   14   15   16   Av. num. inspect.
χ = 0          0.3  0.2  0.2  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  4.88
χ = 5          0.4  0.2  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  4.81
χ = 10         0.5  0.2  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0    4.26
χ = 15         0.5  0.2  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0    5.32
χ = 20         0.6  0.2  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0    0    3.96
χ = 25         0.7  0.2  0.2  0.2  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0    0    0    0    0    2.81

Table 8. Preset inspection schedule (inter-inspection times as fractions of the expected tool life) and average number of inspections.

4.4 Dynamic adjustment

We also wanted to test updating the cutting speed as new information on the work remaining becomes available. Hence, after each tool replacement, and assuming the presence of an online sensor system, a deterministic re-optimization is carried out. Furthermore, a maximum cutting speed of 5 m/s is arbitrarily set§. However, as mentioned earlier, re-optimizing with the deterministic model can lead to bad results, since extra tool setups can be expected. Percentage increases over the deterministic case are shown below for various tool magazine capacities:

χ              0       5       10      15      20      25
Increase (%)   29.16   55.48   62.23   67.38   74.43   77.17
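The re-optimization step can be pictured with the following minimal sketch, under our own simplifying assumptions rather than the paper's full deterministic model: the expected tool life at speed V follows Taylor's relation V·T^n = C, processing the remaining work W at speed V takes W/V, and any tool needed beyond those already loaded in the magazine costs one setup time. All parameter values (Taylor constants, setup time, remaining work) are illustrative.

```python
# Minimal sketch of one deterministic re-optimization step after a tool change.
# Assumptions (ours): Taylor's relation V * T**n = C for the expected tool life,
# processing time proportional to 1/V, and a fixed setup time for every tool
# needed beyond those already loaded. Parameter values are illustrative.
import math

N_TAYLOR = 0.25     # Taylor exponent n (illustrative)
C_TAYLOR = 2.0      # Taylor constant C (illustrative)
SETUP = 0.5         # setup time for a tool absent from the magazine
V_MAX = 5.0         # speed limit used in the experiments (m/s)

def tool_life(v):
    """Expected tool life at cutting speed v, from V * T**n = C."""
    return (C_TAYLOR / v) ** (1.0 / N_TAYLOR)

def reoptimize(work_left, tools_loaded, steps=100):
    """Speed minimizing remaining cutting time plus extra setup time."""
    best = None
    for i in range(1, steps + 1):
        v = V_MAX * i / steps                      # speed grid up to the machine limit
        cut_time = work_left / v
        tools_needed = math.ceil(cut_time / tool_life(v))
        total = cut_time + SETUP * max(0, tools_needed - tools_loaded)
        if best is None or total < best[0]:
            best = (total, v, tools_needed)
    return best     # (planned time, chosen speed, tools the plan will consume)

# Called again after every (random) tool death, with the remaining work updated
# from the time actually cut and one fewer duplicate loaded in the magazine.
print(reoptimize(work_left=10.0, tools_loaded=3))
```

Because the actual tool lives are random, the re-optimized plan can still run out of loaded duplicates, which is precisely the source of the extra setups penalizing the figures above.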

Only the largest tool magazine tested yields a better result than the no-sensor case. For this method to be efficient, it should improve on the online sensor case. One way to improve the results would be to reduce the cutting speed so as to avoid a costly tool setup.

§ Simulating with parameters from Table 1 yields optimal cutting speeds ranging between 0.11 and 4.6 m/s.


However, this change was only implemented when, for a given part type, the last tool is scheduled, thus reducing the chance of setting up an extra tool. The reduction is set as a percentage of the optimal deterministic cutting speed. Table 9 presents the percentage increase in processing time when simulating the online adjustment in a stochastic environment over the optimal deterministic case. In this table, the percentage increase is a function of the percentage of the optimal deterministic cutting speed used for the last planned tool and of the tool magazine size (for batches of five part types). The lowest percentage increase for each tool magazine size tested is marked with an asterisk. The estimation error is again about 3% of the values obtained. We notice a clear improvement when compared with static schedules. The greatest improvement is achieved when the tool magazine is large: when χ = 25, the percentage increase drops from 48.75 to 14.70, that is, on average, a reduction of about 34 percentage points in processing time. We can also notice that, as the tool magazine increases in size, the best percentage of the optimal deterministic cutting speed tends to decrease. This could be explained by the fact that more tools are used, resulting in greater variability in the sum of tool lives. For all magazine sizes tested, the best results are obtained when the cutting speed of the last planned tool of a given part type is reduced by 13 to 20%.

% of v*   χ = 0     χ = 5     χ = 10    χ = 15    χ = 20    χ = 25
0.79      13.13     18.71     19.33     18.00     16.20     14.78
0.80      12.57     18.19     18.70     17.37     16.17     14.70*
0.81      12.03     17.52     17.90     16.99     16.09     14.81
0.82      11.68     16.63     17.97     17.08     14.84*    15.83
0.83      11.18     16.15     17.46     16.94     15.90     15.91
0.84      10.98     16.08     17.05     16.87     16.15     15.92
0.85      10.57     15.47*    17.12     16.59*    15.79     16.79
0.86      10.40     17.17     16.80*    17.40     15.36     16.71
0.87      10.18*    15.72     16.93     17.41     17.22     17.71
0.88      10.61     15.73     17.73     17.80     19.46     18.32
0.89      10.82     16.12     18.01     19.75     19.95     20.75
0.90      10.89     16.44     18.72     20.34     21.05     22.28

Table 9. Percentage increase in processing time over the deterministic case for the online adjustment.
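The effect of the last-tool speed reduction can be illustrated with the toy calculation below, again under our own assumptions (gamma tool lives with coefficient of variation 0.3 whose mean scales with speed through Taylor's relation; the remaining work, setup time and Taylor constants are illustrative, not the paper's parameters): running the last planned tool at a fraction β of the optimal deterministic speed v* lengthens its expected life, lowering the probability that an extra, costly setup is needed to finish the part, at the price of a longer nominal cutting time.

```python
# Toy trade-off for running the last planned tool at beta * v_star.
# Assumptions (ours): Taylor mean life T(v) = (C/v)**(1/n) scaling a gamma
# tool life with cv = 0.3; illustrative values for v_star, remaining work,
# setup time and Taylor constants.
import numpy as np

rng = np.random.default_rng(1)
CV, N_T, C_T, SETUP = 0.3, 0.25, 2.0, 0.5
SHAPE = 1.0 / CV**2

def last_tool(beta, v_star=2.0, work_left=2.0, n=200_000):
    """Expected time to finish the part and probability of an extra setup."""
    v = beta * v_star
    mean_life = (C_T / v) ** (1.0 / N_T)                  # longer life at lower speed
    life = rng.gamma(SHAPE, mean_life / SHAPE, size=n)    # random life of the last tool
    cut_needed = work_left / v                            # time to finish at reduced speed
    dies = life < cut_needed
    # If the tool dies first, pay a setup and finish the leftover work at v_star.
    finish = np.where(dies,
                      life + SETUP + (cut_needed - life) * v / v_star,
                      cut_needed)
    return finish.mean(), dies.mean()

for beta in (0.79, 0.82, 0.85, 0.88, 0.91, 1.00):
    t, p = last_tool(beta)
    print(f"beta = {beta:.2f}: E[finish time] = {t:.3f}, P(extra setup) = {p:.3f}")
```

The sweep over β mirrors the grid of Table 9; where the best β falls depends on the size of the setup penalty relative to the tool-life variability, which is consistent with the optimal reduction in Table 9 shifting with the magazine size.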


5 Conclusion

The simulation presented in this paper shows that significant improvements can be made in tool management. A stochastic model, for example, could help the manager set realistic production goals while using the machine tools as effectively as possible. Yet many researchers and managers use deterministic tool lives to schedule or simulate production processes. Moreover, the use of a sensor system can greatly improve throughput times by exploiting the whole economic life of a given tool, or at least by better estimating tool wear. Systems using acoustic, vision or radioactive sensors already exist and can be used to track tool wear without stopping the machine. Using a gamma distribution with a coefficient of variation of 0.3, the simulation showed a reduction of as much as 36 percentage points in the total processing time increase for the production of the same set of part types when online sensors are used instead of the operator's judgment. The offline sensor case yields fairly good results, only about 12 percentage points above the online case for the largest tool magazine and less than 3 points for a machine without a magazine. Optimal parameters for an inspection schedule were simulated; however, constant and variable inspection schedules perform similarly, within less than 4 percentage points of each other. Still, the results are far from those of the deterministic case, the online-sensor scenario yielding as much as 48% extra production time. In other words, for an eight-hour shift, if the manager tries to optimize using a deterministic model, chances are that close to four hours of overtime will be needed to complete the planned production.

Moreover, with this strategy the cutting speed remains constant throughout the processing of a given part type. For a large tool magazine, many duplicates of a tool type might be preloaded for processing a single part type, and with high variability the actual result can be very different from the planned result. One possible way to keep production under control is to adjust the cutting speed after each tool change. A deterministic dynamic strategy was tested, and results improved quite considerably with a minor adjustment: the improvement ranges from 6.8 percentage points when no tool magazine is used to 34 points for the largest magazine size tested. One adjustment made to the model was to reduce the cutting speed of the last planned tool when processing a given part type so as to avoid a costly tool setup.


Through simulation, we found that reducing the cutting speed by as little as 13% can help reduce total processing time.

Several interesting observations emerge from the present work. First, reducing tool life variability only slightly improves the results; even with more reliable tools, the need for stochastic analysis seems justified. Second, the most commonly used tool life distributions yield similar results in our case: even though the gamma distribution is probably the most convenient to use, since it can easily be manipulated algebraically, they are all equivalent in terms of expected processing time. Third, batch size does not seem relevant in our analysis, whereas magazine size greatly influences the results. Finally, the study also concludes that using a sensor system can greatly improve the performance of the system; in fact, it is the greatest source of improvement. Future work could address a stochastic model for minimizing processing time, as well as sensitivity analyses on the inspection time, the setup time and the coefficient of variation (i.e., for what value of ξ a deterministic model remains reliable).


Acknowledgements. This research was supported in part by the Natural Sciences and Engineering Research Council of Canada, under Grant 0105560. The authors are also thankful to the National Science Foundation for supporting this research through grant DMII: 9813177.
