School of Computer Science and Information Technology University of Nottingham Jubilee Campus NOTTINGHAM NG8 1BB, UK
Computer Science Technical Report No. NOTTCS-TR-2008-2
A Genetic Programming Hyper-Heuristic Approach for Evolving Two Dimensional Strip Packing Heuristics
Edmund Burke, Matthew Hyde, Graham Kendall and John Woodward
First released: 2008
© Copyright 2008 Edmund Burke, Matthew Hyde, Graham Kendall and John Woodward
A Genetic Programming Hyper-Heuristic Approach for Evolving Two Dimensional Strip Packing Heuristics Edmund K Burke, Member, IEEE, Matthew Hyde, Graham Kendall, Member, IEEE, and John Woodward
Abstract—We present a genetic programming hyper-heuristic system to evolve a ‘disposable’ heuristic for each of a wide range of benchmark instances of the two-dimensional strip packing problem. The evolved heuristics are constructive, and decide both which piece to pack next and where to place that piece, given the current partial solution. Usually, there is a trade-off between the generality of a packing heuristic and its performance on a specific instance. This hyper-heuristic approach is particularly beneficial because it produces a heuristic tuned to the instance, and it is general enough that it can do so for any strip packing instance, with no change of parameters. The contribution of this paper is to show that the genetic programming hyper-heuristic can be employed to automatically generate heuristics which are better than the human-designed state of the art constructive heuristics, in a very well-studied area.

Index Terms—Genetic programming, hyper-heuristics, two-dimensional stock cutting
Edmund K Burke, Matthew Hyde, Graham Kendall, and John Woodward are with the ASAP research group, in the School of Computer Science and IT, Nottingham University, Nottingham, UK. Email [ekb,mvh,gxk,jrw]@cs.nott.ac.uk

I. INTRODUCTION

This paper presents a hyper-heuristic genetic programming system for the two-dimensional strip packing problem, where a number of rectangles must be placed onto a sheet with the objective of minimising the length of sheet that is required to accommodate the items. The sheet has a fixed width, and the required length of the sheet is measured as the distance from the base of the sheet to the piece edge furthest from the base. This problem is known to be NP-hard [1], and has many industrial applications, as there are many situations where a series of rectangles of different sizes must be cut from a sheet of material (for example, glass or metal) while minimising waste. Indeed, many industrial problems are not limited to just rectangles (for example textiles, leather, etc.), and this presents another challenging problem [2].

There are many other types of cutting and packing problems in one, two and three dimensions. A typology of these problems is presented by Wäscher et al. in [3]. As well as their dimensionality, the problems are further classified into different types of knapsack and bin packing problems, and by how similar the pieces are to each other. In this work, the rectangular pieces are free to rotate by 90 degrees, and there can be no overlap of pieces. The guillotine version of this problem occurs where the cuts to the material can only be made perpendicular to an edge, and must split the sheet into two pieces, then those same cutting rules apply
recursively to each piece. However, we are interested here in the non-guillotine version of the problem, which has no such constraint on how the pieces are cut.

The motivation behind this work is to develop a hyper-heuristic genetic programming methodology which can automatically generate a novel heuristic for any new problem instance. This has the potential to eliminate the time-consuming process of manual problem analysis and heuristic building that a human programmer would carry out when faced with a new problem instance or set of instances. Work on automatic heuristic generation has not been presented before for this problem. However, similar work on other problem domains has been published (see section II-B).

Hyper-heuristics are defined as heuristics which search a space of heuristics, as opposed to searching a space of solutions directly [4] (which, of course, is the conventional approach to employing evolutionary algorithms). We employ genetic programming as a hyper-heuristic to search the space of heuristics that it is possible to construct from a set of building blocks. The output is a heuristic which is applied to the solution space to produce a solution. We show that our hyper-heuristic designs and builds heuristics which are competitive with human-designed heuristic and metaheuristic approaches. Also, the system is general enough to do so without any change in parameters. Any problem instance can be given to the hyper-heuristic and a heuristic will be produced that obtains a solution.

There are two components to any constructive heuristic used for the two-dimensional strip packing problem: the order in which the pieces are considered, and the policy that decides where each piece is placed. Many of the heuristics created by humans are reliant on the presented order of the pieces before the packing begins. Often, the pieces are pre-ordered by size, which can achieve better results [5]. However, it is not currently possible to say that this will result in a better packing, in the general case, than a random ordering. Metaheuristics have been successfully employed to generate a good ordering of the pieces before using a simple placement policy to pack them [6], [5]. These hybrid metaheuristic approaches have shown that it is possible for one heuristic to gain good results on a wider range of instances because of the ability to evolve a specific ordering of the pieces for a given instance. However, they are still limited by the fact that their packing heuristic may not perform well on the instance regardless of the piece order, which would make it difficult for the hybrid approach to find a good solution. The heuristics we evolve do not suffer from the same limitations. As we will show, they decide which piece to place next in the partial solution and where to place it. So
the evolved heuristics’ performance is independent of any piece order.

In section II, we introduce the background literature on genetic programming, hyper-heuristics, and 2D strip packing approaches. Section III presents our algorithm. Section IV describes the benchmark problem instances used in this paper, and section V presents the results we have obtained for those benchmark instances and compares them to recent results in the literature. Finally, conclusions and ideas for future work are given.

II. BACKGROUND

A. Genetic Programming

Genetic programming (GP) (see [7], [8], [9]) is a technique used to evolve populations of computer programs represented as tree structures. An individual’s performance is assessed by evaluating its performance at a specific task, and genetic operators such as crossover and mutation are performed on the individuals between generations. A list of GP parameters used in this paper is given in section III-D.

B. Hyper-Heuristics

Hyper-heuristics are defined as heuristics that search a space of heuristics, as opposed to searching a space of solutions directly [4]. Research in this area is motivated by the goal of raising the level of generality at which optimisation systems can operate [10], and by the assertion that in many real-world problem domains, there are users who are interested in “good-enough, soon-enough, cheap-enough” solutions to their search problems, rather than optimal solutions [10]. In practice, this means researching systems that are capable of operating over a range of different problem instances, and sometimes even across problem domains, without expensive manual parameter tuning, and while still maintaining a certain level of solution quality.

Many existing metaheuristics have been used successfully as hyper-heuristics. Both a genetic algorithm [11] and a learning classifier system [12] have been used as hyper-heuristics for the one-dimensional bin packing problem. A genetic algorithm with an adaptive length chromosome and a tabu list was used in [13] as a hyper-heuristic. A case based reasoning hyper-heuristic is used in [14] for both exam timetabling and course timetabling. Simulated annealing is employed as a hyper-heuristic in [15] for the shipper rationalisation problem. A tabu search hyper-heuristic [16] is shown to be general enough to be applied to two very different domains: nurse scheduling and university course timetabling. A graph based hyper-heuristic for timetabling problems is presented in [17]. Three new hyper-heuristic architectures are presented in [18], treating mutational and hill climbing low-level heuristics separately. A choice function has also been employed as a hyper-heuristic, to rank the low-level heuristics and choose the best [19]. A distributed choice function hyper-heuristic is presented in [20].

The common theme to the hyper-heuristic research mentioned above is that all of the approaches are given a set of
low-level heuristics, and the hyper-heuristic chooses the best one or the best sequence from those. Another class of hyper-heuristic, which has received less attention in the literature, generates low-level heuristics from a set of building blocks given to it by the user. The aim of this class of hyper-heuristic is to create a low-level heuristic from these building blocks, which will be either re-usable on new problems, or ‘disposable’, in the sense that they will only be evolved for the purpose of finding a solution to a particular instance. There are two main examples of this class of hyper-heuristic. The ‘CLASS’ system presented in [21], [22] and [23] is an automatic generator of local search heuristics for the SAT problem, and is competitive with human-designed heuristics. Simple one-dimensional bin packing heuristics are evolved in [24], which match the performance of the human-designed first-fit heuristic.

C. 2D Stock Cutting Approaches

1) Exact Methods: Gilmore and Gomory [25] first used a linear programming approach in 1961 to solve small size problem instances. Tree search procedures have been employed more recently to produce optimal solutions for small instances of the 2D guillotine stock cutting problem [26] and the 2D non-guillotine stock cutting problem [27]. The method used in [26] has since been improved in [28] and [29]. It is recognised that these methods cannot provide good results for large instances. For example, the largest instance used in both [28] and [29] is 60 pieces. For larger instances, therefore, a heuristic approach must be introduced, which cannot guarantee the optimal solution, but can produce high quality results for much larger problems.

2) Heuristic Methods: Baker et al. define a class of packing algorithms named ‘bottom up, left justified’ (BL) [30]. These algorithms maintain bottom-left stability during the construction of the solution, meaning that none of the pieces can be moved any further down or left from where they are positioned. The heuristic presented in [30] has come to be named ‘bottom-left-fill’ (BLF) because it places each piece in turn into the lowest available position, including any ‘holes’ in the solution, and then left justifies it. While this heuristic is intuitively simple, implementations are often not efficient because of the difficulty in analysing the holes in the solution for the points at which a piece can be placed [31]. Chazelle presents an optimal method for determining the ordered list of points that a piece can be put into, using a ‘spring’ representation to analyse the structure of the holes [31]. These heuristics take, as input, a list of pieces, and the results rely heavily on the pieces being in a ‘good’ order [30]. Theoretical work presented by Brown et al. [32] shows the lower bounds for online algorithms, both for piece lists pre-ordered by decreasing height and width, and for non-pre-ordered lists. Results in [5] have shown that pre-ordering the pieces by decreasing width or decreasing height before applying BL or BLF results in performance increases of between 5% and 10%.
Recently, a ‘best-fit’ style heuristic was presented in [33]. In this algorithm, the lowest available space on the sheet is selected, and the piece which best fits into that space is placed there. This algorithm is shown to produce better results than previously published heuristic algorithms on benchmark instances [33]. The best-fit heuristic packs the pieces in a similar way to the heuristics we evolve in this paper, because all the pieces are considered for packing, not just the first in the sequence given to it. However, in contrast, best-fit only considers one space for the pieces to go, and always packs the selected piece on a pre-determined side of the space. The heuristics evolved in this paper consider all spaces and both sides of each space.

Zhang et al. [34] use a recursive algorithm, running in O(n^3) time, to create good strip packing solutions, based on the ‘divide and conquer’ principle. Finally, two heuristics for the strip cutting problem with sequencing constraint are presented by Rinaldi and Franz [35], based on a mixed integer linear programming formulation of the problem.

3) Metaheuristic Methods: Metaheuristics have been successfully employed to evolve a good ordering of pieces for a simple heuristic to pack. For example, Jakobs [6] uses a genetic algorithm to evolve a sequence of pieces for a simpler variant of the BL heuristic. This variant packs each piece by initially placing it in the top right of the sheet and repeating the cycle of moving it down as far as it will go, and then left as far as it will go. Liu and Teng [36] proposed a simple BL heuristic to use with a genetic algorithm that evolves the order of pieces. Their heuristic moves the piece down and to the left, but as soon as the piece can move down it is allowed to do so. However, using a BL approach with a metaheuristic to evolve the piece order is somewhat limited; for example, it is shown in [30], [37] that, for certain instances, there is no sequence that can be given to the BLF heuristic that results in the optimal solution. Ramesh Babu and Ramesh Babu [38] use a genetic algorithm in the same way as Jakobs, to evolve an order of pieces, but use a slightly different heuristic to pack the pieces, and different genetic algorithm parameters, improving on Jakobs’ results. Simulated annealing is used in [39] to evolve a piece order for the best-fit algorithm of [33].

Valenzuela and Wang [40] employ a genetic algorithm for the guillotine variant of the problem. They use a linear representation of a slicing tree as the chromosome. The slicing tree determines the order that the guillotine cuts are made and between which pieces. The slicing trees bear a similarity to the genetic programming trees in this paper, which represent heuristics. The slicing trees are not heuristics, however; they only have relevance to the instance they are applied to, while a heuristic dynamically takes into account the piece sizes of an instance before making a judgement on where to place a piece. If the pattern of cuts dictated by the slicing tree were to be applied to a new instance, the pattern would not consider any properties of the new pieces. For example, if the slicing tree defines a cut between piece one and piece nine, then this cut will blindly be made in the
new instance even if these pieces now have wildly different sizes. A heuristic would consider the piece sizes and the spaces available before making a decision.

Hopper and Turton [5] compare the performance of several metaheuristic approaches for evolving the piece order, each with both the BL constructive algorithm of Jakobs [6], and the BLF algorithm of [30]. Simulated annealing, a genetic algorithm, naive evolution, hill climbing, and random search are all evaluated on benchmark instances, and the results show that better results are obtained when the algorithms are combined with the BLF decoder. The genetic algorithm and BLF decoder (GA+BLF) and the simulated annealing approach with BLF decoder (SA+BLF) are used as benchmarks in this paper.

Other approaches start with a solution and iteratively improve it, rather than heuristically constructing a solution. Lai and Chan [41] and Faina [42] both use a simulated annealing approach in this way, and achieve good results on problems of small size. Also, Bortfeldt [43] uses a GA which operates directly on the representations of strip packing solutions.

A reactive greedy randomised adaptive search procedure (reactive GRASP) is presented in [44] for the two-dimensional strip packing problem. The method involves a constructive phase and a subsequent iterative improvement phase. To obtain the final overall algorithm, four parameters were chosen with the results from a computational study, using some of the problem sets used in the paper. First, one of four methods of selecting the next piece to pack is chosen. Second, a method of randomising the piece selection is chosen from a choice of four. Third, there are five options for choosing a parameter δ, which is used in the randomisation method, and finally there are four choices for the iterative improvement algorithm after the construction phase is complete. The method is a complex algorithm with many parameters, which are chosen by hand.

Belov et al. have obtained arguably the best results in the literature for this problem [37]. Their ‘SVC’ algorithm is based on an iterative process, repeatedly applying one constructive heuristic, ‘SubKP’, to the problem, each time updating certain parameters that guide its packing. The results obtained are very similar to those obtained by the GRASP method. They obtain the same overall result on the ‘C’, ‘N’ and ‘T’ instances of Hopper and Turton, but SVC obtains a slightly better result on ten instance sets from Berkey and Wang, and Martello and Vigo. Together, SVC(SubKP) and the reactive GRASP method represent the state of the art in the literature, and SVC(SubKP) seems to work better for larger instances [37]. We compare with the results of the reactive GRASP in section V, because they represent some of the best in the literature, and their reported results cover all of the data sets that we have used here.

It must be noted, however, that the aims of the hyper-heuristic methodology presented in this paper differ slightly from the aims of other work in the literature. The aim here is to generate constructive heuristics
automatically. The quality of the results that the heuristics produce is of high importance, but we do not aim only for better results than the state of the art hand-crafted heuristics. Therefore, the contribution of this paper is to show that automatically generated constructive heuristics can obtain results in the same region as the current state of the art human-developed heuristics in two-dimensional strip packing [44], [37], which use a constructive phase and an iterative phase. We also show that the automatically generated constructive heuristics obtain better results than the human-designed state of the art constructive heuristic presented in [33].

III. METHODOLOGY

In section III-A, we explain the representation of the problem that we use, and how it is updated each time a piece is placed in the solution. Section III-B explains how the heuristic decides which piece to pack next and where to place it. A step-by-step packing example is given in section III-C to further clarify this process. Section III-D explains how the heuristics themselves are evolved, detailing the GP parameters.

A. Representation of the Problem

A constructive heuristic considers the strip packing problem to be a sequence of steps, where a piece must be placed at each step. Our hyper-heuristic system evolves a constructive heuristic for a given problem instance. At every step, the heuristic chooses a piece, and the position to place it, according to the state of the sheet and the pieces already placed on it. To this end, the sheet is represented as a set of dynamic ‘bins’, the number and configuration of which will change at every step. Each bin has a height (the distance from the base of the sheet to the base of the bin), a lateral position, and a width.

This representation takes its inspiration from the paper by Burke, Kendall and Whitwell [33], which introduces the best-fit heuristic for two-dimensional packing. This heuristic selects the lowest available space, and then finds the piece which best fits it. This means the solution is essentially represented as a number of dynamic bins, in order to efficiently find the lowest one. At the beginning of the packing process the sheet will be represented as just one bin, with height zero and width equal to the width of the sheet. As an example, figure 1 shows the bin configuration (three bins, with different heights and widths) when two pieces have been placed onto the sheet in the lower-right and lower-left corners. Figures 1-4 show a step-by-step example of two more pieces being packed and how the bin structure changes as each piece is packed. There follows a description of the four situations where the bin structure will change.

1) Bin creation: When a piece P is placed in bin B_x, B_x is replaced by two bins, B_y and B_z. B_y will form directly on top of P, and B_z will represent what remains of B_x, and will be formed only if Width(P) < Width(B_x). Formally:

Height(B_y) = Height(B_x) + Height(P)
Width(B_y)  = Width(P)
Height(B_z) = Height(B_x)
Width(B_z)  = Width(B_x) − Width(P)

This process is shown in figs 1-2.

2) Bin Combination: Using the notation from the bin creation section, if the height of B_y is equal to the height of a neighbouring bin B_n, forming a continuous flat level, then B_y is joined to B_n, amalgamating them into one bin. This process is shown in fig 4.

3) Raising Narrowest Bin: If B_z is too narrow for any remaining piece to fit in, then Height(B_z) is increased to the level of its nearest neighbour, and then the bin combination process occurs between the two bins. This is only invoked if B_z is between two higher bins. The space is wasted space, and so can be discarded for the purposes of continuing to construct a solution. If B_z is not between two higher bins, then the space cannot yet be counted as wasted space. This process is shown in figs 2-3.

4) Raising Lowest Bin: If there is no piece that can fit in any bin, the lowest bin in the current solution is raised to the level of its nearest neighbour, and the two bins are combined, forming a wider bin. This is done iteratively until a bin is formed that is wide enough to accommodate at least one piece.
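To make these update rules concrete, the following sketch shows one possible implementation of the dynamic bin list. It is a minimal illustration, not the authors' code: the names (Bin, place_piece, and so on) are ours, rule 3 is only noted in a comment, and integer dimensions and left-side placement are assumed.

```python
# A minimal sketch of the dynamic bin representation described above.
# Names (Bin, place_piece, raise_lowest_bin) are illustrative, not from the paper.

class Bin:
    def __init__(self, x, width, height):
        self.x = x            # lateral position of the bin's left edge
        self.width = width    # horizontal space available in the bin
        self.height = height  # distance from the base of the sheet to the base of the bin

def combine_neighbours(bins):
    """Rule 2 (bin combination): merge adjacent bins whose bases form a flat level."""
    j = 0
    while j < len(bins) - 1:
        if bins[j].height == bins[j + 1].height:
            bins[j].width += bins[j + 1].width
            del bins[j + 1]
        else:
            j += 1

def place_piece(bins, i, piece_w, piece_h):
    """Rule 1 (bin creation): put a piece in the left of bin i, splitting it into B_y and B_z."""
    b = bins[i]
    new_bins = [Bin(b.x, piece_w, b.height + piece_h)]        # B_y forms on top of the piece
    if piece_w < b.width:                                      # B_z only exists if space remains
        new_bins.append(Bin(b.x + piece_w, b.width - piece_w, b.height))
    bins[i:i + 1] = new_bins
    combine_neighbours(bins)
    # Rule 3 (raising the narrowest bin) would be applied here when the remaining
    # B_z is too narrow for every unpacked piece and sits between two higher bins.

def raise_lowest_bin(bins):
    """Rule 4 (raising lowest bin): lift the lowest bin to its nearest neighbour's level."""
    i = min(range(len(bins)), key=lambda k: bins[k].height)
    neighbours = [bins[k].height for k in (i - 1, i + 1) if 0 <= k < len(bins)]
    bins[i].height = min(neighbours)
    combine_neighbours(bins)
```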
Fig. 1. Bin structure after two pieces have been packed

Fig. 2. The middle bin of Fig 1 has been replaced by two new bins due to the third piece being packed into it

Fig. 3. As no piece can fit into bin three of Fig 2, the base of bin three has been raised, to combine it with bin two

Fig. 4. When the fourth piece is packed into the second bin, the heights of bin two and three are the same and they have been combined
WHILE pieces exist to be packed
    IF narrowest bin is too narrow for any piece
            AND narrowest bin is between two higher bins THEN
        raise narrowest bin to level of nearest neighbour
    ELSE IF at least one piece can fit in a bin
        FOR each allocation
            save best allocation according to heuristic
        END FOR
        perform the best allocation on the solution
    ELSE
        raise lowest bin
    END IF
    update bin structure
END WHILE

Fig. 5. Pseudo code showing the overall program structure within which a heuristic operates
TABLE I
THE FUNCTIONS AND TERMINALS AND DESCRIPTIONS OF THE VALUES THEY RETURN

Name               Label   Description
+                  +       Add two inputs
−                  −       Subtract second input from first input
*                  *       Multiply two inputs
%                  %       Protected divide function
Width              W       The width of the piece
Height             H       The height of the piece
Bin Width Left     BWL     Difference between the bin and piece widths
Bin Height         BH      Bin base's height, relative to base of sheet
Opposite Bin       OB      Bin height minus height of opposite bin
Neighbouring Bin   NB      Bin height minus height of neighbouring bin
B. How the Heuristic Decides Where to Put a Piece

This section is a general explanation of the process by which the heuristic decides which piece to pack next and where to put it. This is also summarised in the pseudo code of figure 5, along with the parts of the algorithm that update the bin structure. Section III-C then goes into more detail on this process, using a specific example.

In each bin, there are two places that a piece can be put: in the bottom-left corner or in the bottom-right corner. There are also two orientations that the piece can adopt in both corners. Given a partial solution, we will refer to a combination of piece, bin, corner and orientation as an ‘allocation’. Therefore, there is a total of four allocations to consider for each piece and bin combination, provided that the piece’s width in each orientation is smaller than the width of the bin. An allocation therefore represents one of the set of choices (of a piece and where to put it) that a heuristic could make at the given decision step. A heuristic in this hyper-heuristic system is a function that rates each allocation. The heuristic is evaluated once for each allocation to obtain a score for each allocation. The heuristic scores an allocation by taking into account a number of features, which are represented as the GP terminals shown in the lower six rows of table I. There are two terminal values describing the piece width and height in its given orientation, two describing the bin, and two describing the bins either side of the allocation bin.

‘Neighbouring’ and ‘opposite’ bins are referred to in table I. A given allocation is either in the lower-left or lower-right corner of the bin. If the allocation is in the lower-left of the bin, the bin to the left of it is referred to as the ‘neighbouring’ bin and the bin to the right is the ‘opposite’ bin. Those labels are reversed when the allocation is in the lower-right of the bin. For each possible allocation, the values of the terminals are calculated, and the heuristic is evaluated. The allocation for which the heuristic returns the highest value is deemed to be the best, and therefore that allocation is performed on the solution at the current step. In other words, the piece from the allocation is put in the bin from the allocation, in the corner and orientation dictated by the allocation. This process is shown in the example given in section III-C.
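As an illustration of this scoring step, the sketch below enumerates the candidate allocations and evaluates an arbitrary heuristic over the six terminal values of table I. The representation of bins as (width, base height) pairs, the function names, and the treatment of the sheet edge as an infinitely high wall are assumptions of the sketch, not details taken from the paper.

```python
# Sketch of enumerating and scoring allocations (names are illustrative, not from the paper).
# bins:      list of (width, base_height) pairs, ordered left to right across the sheet
# pieces:    list of (width, height) pairs still to be packed
# heuristic: any callable over the six terminal values of Table I

def terminal_values(bins, i, piece_w, piece_h, corner):
    bin_w, bin_h = bins[i]
    W, H = piece_w, piece_h
    BWL = bin_w - piece_w                 # horizontal space left in the bin after placement
    BH = bin_h                            # height of the bin's base above the sheet base
    # The sheet edge is treated as an infinitely high wall; the paper does not state
    # how edge bins are handled, so this is an assumption of the sketch.
    left = bins[i - 1][1] if i > 0 else float('inf')
    right = bins[i + 1][1] if i < len(bins) - 1 else float('inf')
    neighbour, opposite = (left, right) if corner == 'left' else (right, left)
    NB = bin_h - neighbour                # bin height minus height of neighbouring bin
    OB = bin_h - opposite                 # bin height minus height of opposite bin
    return W, H, BWL, BH, OB, NB

def best_allocation(bins, pieces, heuristic):
    best, best_score = None, None
    for p, (w, h) in enumerate(pieces):
        for rw, rh in {(w, h), (h, w)}:               # both 90-degree orientations
            for i, (bin_w, _) in enumerate(bins):
                if rw > bin_w:
                    continue                           # piece too wide for this bin
                for corner in ('left', 'right'):
                    score = heuristic(*terminal_values(bins, i, rw, rh, corner))
                    if best_score is None or score > best_score:
                        best_score, best = score, (p, rw, rh, i, corner)
    return best                                        # highest-scoring allocation, or None
```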
Fig. 6. An example heuristic. (Drawn as a GP tree in the original; its linear form is given as expression (1) in section III-C: (W * H) / (NB − (BWL * H)) − (BH + OB).)
Fig. 7. The pieces we will consider for packing
C. A packing example

This section works through an example of how the heuristic chooses a piece from those which remain to be packed, and where to put it in the partial solution. It goes into further detail than section III-B. The heuristic we will use in this example is shown in figure 6. It consists of nodes from the GP function and terminal set shown in table I. It contains at least one of each of the functions and terminals; however, this is not a requirement. A heuristic in the population could contain any subset of the nodes available.

We will use the heuristic shown in figure 6 to choose a piece from those which remain to be packed (shown in figure 7) and choose where to place it in the partial solution shown in figure 8. The partial solution shows that two pieces have already been packed by the heuristic, one to the left and one to the right. We do not show this process, because it is the same as the one we will explain, and the example will be more descriptive if we show the process in the middle of the packing. There are three bins in the partial solution, labelled one to three, which are defined by the pieces already packed.

The algorithm takes each piece in turn, and evaluates the tree for every possible allocation of that piece. So, first we will consider piece one from figure 7. A piece can be placed in the left or right of a bin, in either orientation, as long as it does not exceed the width of the bin. So, piece one cannot be placed into bin one, because that bin is only 15 units wide, and the piece is 20 units wide in its narrowest orientation. Piece one can be placed into bin three, but only in one orientation; the piece will be too wide if it is placed in its horizontal orientation. Bin two is wide enough to accommodate the piece in either of its orientations.
Fig. 8. The current partial solution; the heuristic will choose a piece from figure 7 to be placed in this solution
Fig. 9. All of the places where piece one from figure 7 can go
Figure 9 shows these six valid allocations for piece one in the partial solution, labelled A to F. Each of these six allocations will receive a score, obtained by evaluating the tree once for each allocation. The tree will give a different score for each allocation because the GP terminal nodes will evaluate to different values depending on the features of the allocation. This process of evaluating the tree is explained here, by taking the examples of allocations A and C from figure 9.

Figure 10 shows allocation A in detail. To evaluate the tree for allocation A, we will first determine the values of the terminal nodes of the tree. The ‘width’ (W) and the ‘height’ (H) terminals will take the values 20 and 50 respectively. The ‘bin width left’ (BWL) terminal will evaluate to 35, because that is the horizontal space left in the bin after the piece is put in. The ‘bin height’ (BH) terminal evaluates to zero, because the base of bin two is at the foot of the sheet. The opposite bin for the allocation is bin three, as it is to the right of bin two, and the allocation is in the bottom-left corner of bin two. The ‘opposite bin’ (OB) terminal will evaluate to −45, because the opposite bin has a height of 45, and OB is the height of the current bin minus the height of the opposite bin. The neighbouring bin is bin one, because it is to the left of bin two, and the allocation is also in the left of bin two. The ‘neighbouring bin’ (NB) terminal will evaluate to −80, because the neighbouring bin has a height of 80.

Expression 1 shows the tree written in linear form. If we substitute the terminal values into the expression, we get expression 2. This simplifies to expression 3, which evaluates to 44.454 to three decimal places. This value is the score for the allocation of piece one in bin two, in a vertical orientation and on the left side of the bin.

(W * H) / (NB − (BWL * H)) − (BH + OB)    (1)

(20 * 50) / (−80 − (35 * 50)) − (0 − 45)    (2)

−1000/1830 + 45    (3)

Fig. 10. Allocation A from figure 9 in detail

Fig. 11. Allocation C from figure 9 in detail
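As a quick sanity check of this arithmetic, the heuristic of figure 6 can be written directly as a small function and evaluated with the terminal values derived above. The function name and the zero-denominator behaviour of the protected divide are assumptions of this sketch, since the paper does not specify them.

```python
# The heuristic of figure 6 as a plain function (illustrative; the protected divide
# is assumed to return 1 when its denominator is zero, one common GP convention).
def example_heuristic(W, H, BWL, BH, OB, NB):
    denominator = NB - (BWL * H)
    fraction = (W * H) / denominator if denominator != 0 else 1.0
    return fraction - (BH + OB)

# Allocation A: W=20, H=50, BWL=35, BH=0, OB=-45, NB=-80
print(round(example_heuristic(20, 50, 35, 0, -45, -80), 3))   # 44.454
# Allocation C (values derived in the next paragraph): W=50, H=20, BWL=5, BH=0, OB=-80, NB=-45
print(round(example_heuristic(50, 20, 5, 0, -80, -45), 3))    # 73.103
```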
Figure 11 shows allocation C in detail. Again, we will calculate the values of the terminal nodes for this allocation in order to evaluate the tree. W and H are now 50 and 20 respectively; they are different from their values in allocation A because the piece is now in the horizontal orientation. BWL evaluates to five, as this is the horizontal space left in the bin after the piece has been placed, shown in figure 11. BH evaluates to zero, as before, because the allocation concerns the same bin. OB and NB evaluate to −80 and −45 respectively, as the allocation is for the bottom-right side of the bin rather than the left. When the terminal values have been substituted in, the tree simplifies to expression 4, which evaluates to 73.103 to three decimal places. This is the score for the allocation of piece one in bin two, in a horizontal orientation and on the right side of the bin.

−1000/145 + 80    (4)

Fig. 12. Both of the potential allocations for piece two

Fig. 13. The new partial solution after the allocation which received the highest score has been performed

TABLE II
INITIALISATION PARAMETERS OF EACH GENETIC PROGRAMMING RUN

Population size              64, 128 and 256
Maximum generations          50
Crossover probability        0.85
Mutation probability         0.1
Reproduction probability     0.05
Tree initialisation method   Ramped half-and-half
Selection method             Tournament selection, size 7
Of these two allocations we have shown, the second allocation has been rated as better by the heuristic because it received a higher score. The other four allocations for this piece are scored in the same way. Then the allocations possible for piece two are scored, of which there are essentially two, shown in figure 12. There are in fact four allocations which are scored for this piece, but it has identical width and height, so both orientations will produce the same result from the heuristic. The rest of the pieces that remain to be packed have all of their allocations scored in the same way.

Finally, the allocation which received the highest score from the heuristic is actually performed. In other words, the piece from the allocation is committed to the partial solution in the orientation and position in the bin dictated by the allocation. Then the bin structure is updated according to the rules explained in section III-A, because a new piece has been put into a bin. For example, the allocation involving piece one in position ‘C’ from figure 9 received a score of 73.103. If no subsequent allocation (involving the same piece or any other piece) received a higher score than this, then this will be the allocation that is performed. Assuming there are no further pieces to be packed that have a width smaller than five, the bins will then be reconfigured as shown in figure 13, because of the process of raising the narrowest bin, shown in section III-A and figures 2 and 3. The process of choosing a piece and where to put it is now complete, and the next iteration begins. All the remaining pieces are scored again in the same way, and there will be new positions available due to the change in bin structure that has occurred.
D. How the Heuristic is Evolved

The hyper-heuristic GP system creates a random initial population of heuristics from the function and terminal set. The individual's fitness is the height of the solution that it creates when the algorithm in fig 5 is run. Table II shows the GP initialisation parameters. During the tournament selection, if two heuristics obtain the same height, and therefore have the same fitness, the winner will be the heuristic which results in the least waste between the pieces in the solution, not counting the free space at the top of the sheet. Therefore, there is selection pressure on the individuals to produce solutions where the pieces sit next to each other neatly, without any gaps.

The individuals are manipulated using the parameters shown in table II. The mutation operator is point mutation, using the ‘grow’ method explained in [7], with a minimum and maximum depth of 5, and the crossover operator produces two new individuals with a maximum depth of 17. We perform three runs for each problem instance, resulting in three heuristics. For each of the runs we vary the population size, as seen in table II, in order to investigate the impact that the population size makes on the quality of the resulting heuristic.
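The fitness and tie-breaking rule above can be pictured, for illustration, as a lexicographic sort key. The names below and the assumption that the packing routine also reports the wasted area between pieces are ours, not the paper's.

```python
# Illustrative fitness comparison for tournament selection (lower keys are better).
# Primary objective: height of the packing; tie-break: wasted area between pieces,
# ignoring the free space above the topmost piece.
def fitness_key(heuristic, instance, pack_with):
    height, internal_waste = pack_with(instance, heuristic)   # assumed packing routine
    return (height, internal_waste)

def tournament_winner(candidates, instance, pack_with):
    return min(candidates, key=lambda h: fitness_key(h, instance, pack_with))
```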
TABLE III
THE BENCHMARK INSTANCES USED IN THIS PAPER

Instance set name                     Number of   Number of    Sheet    Optimal
                                      Instances   Rectangles   Width    Height
Hopper and Turton (2001)                  21        16-197     20-160   20-240
Valenzuela and Wang (2001)                12        25-1000      100      100
Burke, Kendall and Whitwell (2006)        12        10-500     40-100   40-300
Ramesh Babu and Ramesh Babu (1999)         1          50         1000     375
IV. BENCHMARK PROBLEMS

We use 46 benchmark instances from the literature to test our hyper-heuristic methodology. The instances used are summarised in table III. All of the instances were created from known optimal packings. The Hopper and Turton dataset contains 7 classes of 3 problems each, and each class was constructed from a different sized initial rectangle and contains a different number of pieces. All pieces have a maximum aspect ratio of 7. Valenzuela and Wang created two classes of problem, referred to as ‘nice’ and ‘path’. The nice dataset contains pieces of similar size, and the path dataset contains pieces that have very different dimensions. The dataset from Burke, Kendall and Whitwell contains 12 instances with increasing numbers of rectangles in each. These datasets can be found at http://dip.sun.ac.za/˜vuuren/repositories/levelpaper/spp[1].htm. We also use an instance created by Ramesh Babu and Ramesh Babu, containing 50 rectangles all of similar size. The dimensions of the pieces in this instance are given in [38].

The Valenzuela and Wang dataset uses floating point numbers to represent the dimensions of the rectangles. Our implementation uses integers, so to obtain a dataset we can use, we multiplied the data by 100 and rounded to the nearest integer; the results are then divided by 100 so they can be compared to the other results in the literature. For each instance modified in this way, the total area of all the shapes divided by the width of the sheet is never less than 100, so we can say that this procedure never results in an easier instance, and it is fair to compare the results with others in the literature.
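A minimal sketch of that scaling step is shown below; the function names are ours, and round-to-nearest is assumed to be the rounding used.

```python
# Converting the floating-point Valenzuela and Wang dimensions to integers (sketch).
SCALE = 100

def to_integer_instance(pieces):
    """pieces: list of (width, height) floats; returns dimensions scaled by 100 and rounded."""
    return [(round(w * SCALE), round(h * SCALE)) for w, h in pieces]

def reported_height(integer_height):
    """Heights obtained on the scaled instance are divided by 100 for comparison."""
    return integer_height / SCALE
```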
V. RESULTS AND DISCUSSION

This section is arranged into three subsections. Section V-A explains the benchmark algorithms with which we compare the evolved heuristics. Section V-B presents the results of comparisons with known metaheuristic and constructive heuristic approaches. Finally, section V-C presents the results of a comparison of our evolved heuristics with a complex human-designed heuristic with constructive and iterative phases.

A. Benchmarks

Table IV shows the overall results we have obtained for the system, with a different random seed at the beginning of every run. We compare our results to two metaheuristic methods described in section II-C3: a genetic algorithm with bottom-left-fill decoder, and a simulated annealing approach with bottom-left-fill decoder. Table IV shows only the best result of the two on each instance. These two metaheuristic approaches are also described in [5], and the results evaluated using a density measure rather than the length of sheet measure used in this paper. To obtain the results that we compare with here, the metaheuristic methods were implemented again in [33].

The ‘best-fit’ style algorithm described in [33], which has achieved superior results to BL and BLF, is used as a constructive, non-metaheuristic, benchmark, and we also compare with the reactive GRASP presented in [44]. The GRASP method does not allow piece rotations, while they are allowed for the heuristics evolved here. The GRASP results would probably not be worse if rotations were allowed, so while we are aware of the difference, we believe the comparison with GRASP is still valuable, as it is a comparison with a complex human-designed heuristic with many parameters.

Computational times for the genetic programming runs are shown in the far right column; the figure reported is for the longest of the three runs, which in most cases was the run with the population set to 256. The run times of the GP are relatively large compared to the run times reported for the methods that we compare against here. It is important to keep in mind that this run time includes the time taken for the genetic programming to both design and run a heuristic. The times stated in the literature for the GRASP, metaheuristics, and the best-fit heuristic only include the time taken to run them. We assert that a fair comparison would involve the time taken for the human designers to build as well as run their heuristics. However, the contribution of this paper is not to show that this methodology is quicker, but to show that the heuristic design process can be automated, a result which has not been reported before for this problem.

B. Comparison with Metaheuristics and Best-Fit

Both of the metaheuristics were unable to find a solution to the NiceP6 and PathP6 problems due to excessive time requirements. In table IV, we highlight, in bold type, the results where our system finds a heuristic which equals or beats the metaheuristic approaches and the best-fit style algorithm from [33] for that instance.

The table shows that in the vast majority of cases, the GP system finds a heuristic in at least one of the three runs that obtains packings better than or equal to the two established metaheuristics and the best-fit algorithm. However, there are four exceptions. For the instance c1p1, the GP system cannot find a heuristic that performs better than either the GA+BLF or the SA+BLF, both of which find the optimal solution at a height of 20. It is known in the literature that the metaheuristic approaches perform well on problems of small size because the search space is small. Also, no heuristic is found that is better than the best-fit heuristic on the three most difficult ‘Path’ problems from [40].

We conclude that the heuristics designed automatically by the GP perform as well, and in many cases better than,
these recent human-designed heuristic and metaheuristic approaches.

C. Comparison with the reactive GRASP

The evolved heuristics' results are competitive with the reactive GRASP method explained in section II-C3, and on the majority of problems, the reactive GRASP obtains results which are the same or better than the evolved heuristics. Table V presents the mean and standard deviation of the differences between the reactive GRASP results and the best evolved heuristic for each problem. The table shows the mean and standard deviation of the differences both as percentages and actual units, and it shows these calculations on all the results and on each individual benchmark set.

TABLE IV
RESULTS OF OUR EVOLVED HEURISTICS COMPARED TO RECENT METAHEURISTIC AND CONSTRUCTIVE HEURISTIC APPROACHES, AND THE STATE OF THE ART REACTIVE GRASP APPROACH

Name     Number     Optimal  Meta-      BF         Evolved  Evolved  Evolved  Reactive  Max Time to
         of Pieces  Height   heuristic  Heuristic  64       128      256      GRASP     Evolve (s)
N1          10        40       40         45        40       40       40       40       0.141
N2          20        50       51         53        52       51       52       50       5.27×10^1
N3          30        50       52         52        52       51       51       51       2.06×10^1
N4          40        80       83         83        83       81       82       81       3.96×10^2
N5          50       100      106        105       103      103      105      102       1.80×10^2
N6          60       100      103        103       102      102      101      101       6.21×10^2
N7          70       100      106        107       102      102      102      101       1.19×10^3
N8          80        80       85         84        82       83       82       81       1.52×10^3
N9         100       150      155        152       154      151      152      151       1.67×10^3
N10        200       150      154        152       152      152      152      151       9.48×10^3
N11        300       150      155        152       152      152      151      151       1.19×10^4
N12        500       300      312        306       305      311      304      303       7.55×10^4
c1p1        16        20       20         21        21       21       21       20       2.08×10^1
c1p2        17        20       21         22        21       21       21       20       3.38×10^1
c1p3        16        20       20         24        21       20       20       20       7.38
c2p1        25        15       16         16        16       16       16       15       6.73×10^1
c2p2        25        15       16         16        16       16       16       15       1.60×10^1
c2p3        25        15       16         16        16       15       16       15       1.63×10^1
c3p1        28        30       32         32        31       31       31       30       5.08×10^1
c3p2        29        30       32         34        32       32       32       31       1.75×10^2
c3p3        28        30       32         33        34       32       34       30       4.42×10^1
c4p1        49        60       64         63        64       63       62       61       2.34×10^2
c4p2        49        60       63         62        61       62       62       61       2.78×10^2
c4p3        49        60       62         62        61       62       62       61       1.25×10^2
c5p1        73        90       94         93        93       91       92       91       1.24×10^3
c5p2        73        90       95         92        93       91       92       91       7.21×10^2
c5p3        73        90       95         93        91       93       91       91       1.48×10^3
c6p1        97       120      127        123       122      123      121      121       2.65×10^3
c6p2        97       120      126        122       132      122      121      121       2.95×10^3
c6p3        97       120      126        124       122      123      122      121       1.72×10^3
c7p1       196       240      255        247       245      245      245      244       2.32×10^3
c7p2       197       240      251        244       243      246      243      242       4.83×10^3
c7p3       196       240      254        245       246      243      243      243       2.65×10^4
NiceP1      25       100      108.2      107.4     106.73   107.05   106.56   103.7     1.41×10^2
NiceP2      50       100      112        108.5     107.17   108.76   107.97   104.6     9.67×10^2
NiceP3     100       100      113        107       104.84   110.5    105.58   104       1.40×10^3
NiceP4     200       100      113.2      105.3     120.85   118.82   104.18   103.6     1.05×10^4
NiceP5     500       100      111.9      103.5     110      102.6    101.97   102.2     8.13×10^4
NiceP6    1000       100       -         103.7     103.1    101.69   101.95   102.2     2.81×10^5
PathP1      25       100      106.7      110.1     110.61   112.86   108.3    104.2     1.03×10^2
PathP2      50       100      107        113.8     108.98   108.73   104.53   101.8     7.37×10^2
PathP3     100       100      109        107.3     126.21   113.08   106.87   102.6     2.37×10^3
PathP4     200       100      108.8      104.1     109.27   106.54   104.95   102       6.69×10^3
PathP5     500       100      111.11     103.7     114.05   121.97   106.18   103.1     1.34×10^5
PathP6    1000       100       -         102.8     126.5    118.81   115.45   102.5     6.90×10^5
RBP1        50       375      400        400       400      400      400      375       5.40×10^2
TABLE V
DIFFERENCE IN PERFORMANCE OF THE BEST EVOLVED HEURISTICS FOR EACH INSTANCE AND THE REACTIVE GRASP; THE FIGURES REPRESENT HOW MUCH GREATER THE EVOLVED HEURISTICS' RESULTS ARE

Instance set name                       Percent            Units
                                        Mean    S.D.       Mean    S.D.
All Instance Sets                       1.91    2.69       1.72    4.09
Burke, Kendall and Whitwell (2006)      0.52    0.66       0.50    0.52
Hopper and Turton (2001)                1.90    2.58       0.57    0.60
Valenzuela and Wang (2001)              2.93    3.42       3.02    3.51
Table V therefore summarises the observation that on both the Hopper and Turton, and Burke, Kendall and Whitwell datasets, the best evolved heuristics for those problems match the reactive GRASP or are one unit worse, apart from the problem c3p3, where the best evolved heuristic is two units worse. However, on each of the problems NiceP5 and NiceP6, at least one heuristic is generated which beats the result of the reactive GRASP, which is a significant result on two relatively large problems where the evolved heuristic is only constructive, with no post-processing such as an iterative improvement stage.

D. Example packings

Fig. 14. Packing of instance c5p3 to a height of 91. Packed by the heuristic evolved with population 64

Fig. 15. Packing of instance c5p3 to a height of 91. Packed by the heuristic evolved with population 256

Fig. 16. Heuristic that packs fig 14:
(- (+ (- (+ (* NBH (- W W)) (- (- (+ (* NBH (* (- W W) (- W W))) (- (- (* (+ BH BWL) (- W W)) NBH) (* BH BH))) (- (+ (% H OBH) (+ W NBH)) (- (* W W) (* BH BH)))) (+ (* BH BH) (- (+ BH BWL) OBH)))) (- (+ (* BH BH) (- (+ BH BWL) OBH)) (- W (* BH BH)))) (- (* (+ BH BWL) (- W W)) (+ BH BWL))) (- (+ (+ W BH) (+ W NBH)) (- (* W W) (* W W))))

Fig. 17. Heuristic that packs fig 15:
(- (- (- (- (- (- (+ OBH W) (- BH W)) (- BH W)) (% (+ NBH OBH) (+ NBH BH))) (% (+ NBH OBH) (% BWL BWL))) (% (+ OBH W) (% ((- (- (- (+ OBH W) (- BH W)) (- (% (+ NBH OBH) (- (- (- (- BH (- NBH BWL)) (- W W)) (% (% (+ NBH OBH) (% BWL BWL)) (+ NBH BH))) BH)) W)) (- BH (- NBH BWL))) (- BH W)) NBH))) BH)

Figures 14 and 15 show two packings achieved for the problem instance c5p3, both of which have a height of 91,
just 1 unit over the optimal packing. Figures 16 and 17 show the two evolved heuristics that created these packings, where W and H are the piece width and piece height respectively. OBH and NBH are the opposite bin height and the neighbouring bin height. BWL is the bin width left, and BH is the bin height. All of these GP terminals are summarised in table I. The heuristics have a tree structure like that of figure 6, but could not be displayed in this way because of space constraints. These two heuristics are not easily understood and likely contain redundancy as is normal in the output of a GP run. They are offered not as an explanation of why the pieces were packed in this way, but as examples of the output of our hyper-heuristic. They illustrate that the output heuristics are complex, and take a different form to that which a human programmer would write.
Figure 16 was evolved when the population was set to 64, and figure 17 was evolved when the population was set to 256. Figure 16 is a heuristic that packs the pieces of this instance leaving no gaps at all, as figure 14 shows. Interestingly, figure 17 is a heuristic that decides to pack one of the smallest pieces near to the beginning, seen in the bottom left corner of figure 15. Even though this may seem counter-intuitive to humans, it is a strategy that pays off in the long run for this problem instance.
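For readers who want to trace expressions like those of figures 16 and 17 by hand, a small evaluator for this prefix form is sketched below. The nested-tuple representation, the function name and the zero-denominator convention of the protected divide are assumptions of the sketch, and the example values are arbitrary.

```python
# Minimal evaluator for prefix expressions such as those in figs. 16 and 17 (illustrative).
# An expression is either a terminal name ('W', 'H', 'BWL', 'BH', 'OBH', 'NBH')
# or a tuple (operator, left_subtree, right_subtree).
def evaluate(expr, terminals):
    if isinstance(expr, str):
        return terminals[expr]
    op, left, right = expr
    a, b = evaluate(left, terminals), evaluate(right, terminals)
    if op == '+':
        return a + b
    if op == '-':
        return a - b
    if op == '*':
        return a * b
    if op == '%':
        return a / b if b != 0 else 1.0   # protected divide; the zero case is assumed
    raise ValueError('unknown operator: ' + op)

# Example: the subtree (- (+ OBH W) (- BH W)) from fig. 17, with arbitrary terminal values
print(evaluate(('-', ('+', 'OBH', 'W'), ('-', 'BH', 'W')),
               {'W': 20, 'H': 50, 'BWL': 35, 'BH': 0, 'OBH': 45, 'NBH': 80}))
```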
VI. CONCLUSIONS

This paper has shown that an evolutionary hyper-heuristic approach can generate good quality disposable heuristics for the 2D strip packing problem. For a given instance, the hyper-heuristic evolves a good heuristic for deciding where to place each piece in the current solution. The results obtained by the evolved heuristics were mostly better than both established human-designed metaheuristics and a recently published heuristic approach shown to be superior to the well documented heuristic approaches of BL and BLF. In the cases where the results were not better, the evolved heuristics resulted in at most only 5% more waste. To automatically generate heuristics which are competitive with these state of the art approaches would, in itself, be a major scientific contribution. Moreover, the constructive heuristics evolved here are better than the human-designed ‘best-fit’ algorithm, the state of the art constructive heuristic for this problem. They are also better than the state of the art metaheuristic approaches circa 2001, presented in [5].

The genetic programming hyper-heuristic operates at a higher level of abstraction than previous metaheuristic approaches, by operating on a space of heuristics and not directly on a space of solutions. This means the hyper-heuristic can automatically generate a heuristic which performs well on a given problem instance, allowing good results to be obtained on a wide variety of instances without changing any parameters of the system. All that is required is to input a different problem instance.

VII. FUTURE WORK

Evolutionary algorithms which evolve candidate solutions to a problem instance will output a solution only applicable to that instance. In contrast, the heuristics evolved by the hyper-heuristic methodology shown here can be applied to other problem instances. They will produce a legal solution on an unseen instance because they are methods to solve a problem; they are not solutions. Therefore, to extend this work, we wish to thoroughly test the evolved heuristics on new problem instances, as opposed to treating them as ‘disposable’ heuristics only applicable to the problem instance they are evolved for. We have carried out similar work on evolving re-usable heuristics for the one-dimensional bin packing problem in [45], and we will extend this work to see if this is possible in the two-dimensional case. We will apply the evolved heuristics to new instances of varying similarity to the training instance, and will investigate whether a heuristic evolved on one type of problem still performs well on new problems with similar piece sizes. We will also investigate how large the training set needs to be in order for the heuristic to successfully generalise. It is important to note that once evolved, the heuristics can be applied very quickly to an instance because they are constructive, and involve no iteration. Extending the work in this way would potentially justify the long run times in the evolution phase, because the heuristics could then be applied to unseen instances very quickly.

ACKNOWLEDGMENT

This work was supported by EPSRC grant reference EP/C523385/1.
REFERENCES

[1] M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness. San Francisco: W. H. Freeman and Company, 1979.
[2] E. K. Burke, R. S. R. Hellier, G. Kendall, and G. Whitwell, “A new bottom-left-fill heuristic algorithm for the two-dimensional irregular packing problem,” Operations Research, vol. 54, no. 3, pp. 587–601, 2006.
[3] G. Wäscher, H. Haußner, and H. Schumann, “An improved typology of cutting and packing problems,” European Journal of Operational Research, vol. 183, no. 3, pp. 1109–1130, 2007.
[4] P. Ross, “Hyper-heuristics,” in Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques, E. K. Burke and G. Kendall, Eds. Boston: Kluwer, 2005, pp. 529–556.
[5] E. Hopper and B. C. H. Turton, “An empirical investigation of metaheuristic and heuristic algorithms for a 2d packing problem,” European Journal of Operational Research, vol. 128, no. 1, pp. 34–57, 2001.
[6] S. Jakobs, “On genetic algorithms for the packing of polygons,” European Journal of Operational Research, vol. 88, no. 1, pp. 165–181, 1996.
[7] J. R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection. Boston, Massachusetts: The MIT Press, 1992.
[8] J. R. Koza and R. Poli, “Genetic programming,” in Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques, E. K. Burke and G. Kendall, Eds. Boston: Kluwer, 2005, pp. 127–164.
[9] W. Banzhaf, P. Nordin, R. Keller, and F. Francone, Genetic Programming, an Introduction: On the Automatic Evolution of Computer Programs and its Applications. San Francisco: Morgan Kaufmann, 1998.
[10] E. K. Burke, E. Hart, G. Kendall, J. Newall, P. Ross, and S. Schulenburg, “Hyper-heuristics: An emerging direction in modern search technology,” in Handbook of Meta-Heuristics, F. Glover and G. Kochenberger, Eds. Boston, Massachusetts: Kluwer, 2003, pp. 457–474.
[11] P. Ross, J. G. Marin-Blazquez, S. Schulenburg, and E. Hart, “Learning a procedure that can solve hard bin-packing problems: A new GA-based approach to hyperheuristics,” in Proceedings of the Genetic and Evolutionary Computation Conference 2003 (GECCO ’03), Chicago, Illinois, 2003, pp. 1295–1306.
[12] P. Ross, S. Schulenburg, J. G. Marin-Blazquez, and E. Hart, “Hyper-heuristics: learning to combine simple heuristics in bin packing problems,” in Proceedings of the Genetic and Evolutionary Computation Conference 2002 (GECCO ’02), New York, NY, 2002.
[13] L. Han and G. Kendall, “Investigation of a tabu assisted hyper-heuristic genetic algorithm,” in Proceedings of the Congress on Evolutionary Computation (CEC ’03), vol. 3, Canberra, Australia, December 2003, pp. 2230–2237.
[14] E. K. Burke, S. Petrovic, and R. Qu, “Case-based heuristic selection for timetabling problems,” Journal of Scheduling, vol. 9, no. 2, pp. 115–132, 2006.
[15] K. Dowsland, E. Soubeiga, and E. K. Burke, “A simulated annealing hyper-heuristic for determining shipper sizes,” European Journal of Operational Research, vol. 179, no. 3, pp. 759–774, 2007.
[16] E. K. Burke, G. Kendall, and E. Soubeiga, “A tabu-search hyperheuristic for timetabling and rostering,” Journal of Heuristics, vol. 9, no. 6, pp. 451–470, 2003.
[17] E. K. Burke, B. McCollum, A. Meisels, S. Petrovic, and R. Qu, “A graph-based hyper-heuristic for timetabling problems,” European Journal of Operational Research, vol. 176, pp. 177–192, 2007.
[18] E. Özcan, B. Bilgin, and E. E. Korkmaz, “Hill climbers and mutational heuristics in hyperheuristics,” in LNCS 4193, Proceedings of the 9th International Conference on Parallel Problem Solving from Nature (PPSN 2006), T. Runarsson, H.-G. Beyer, E. Burke, J. J. Merelo-Guervos, D. Whitley, and X. Yao, Eds., Reykjavik, Iceland, September 2006, pp. 202–211.
[19] P. Cowling, G. Kendall, and E. Soubeiga, “A hyperheuristic approach to scheduling a sales summit,” in Proceedings of the 3rd International Conference on the Practice and Theory of Automated Timetabling (PATAT 2000), E. K. Burke and W. Erben, Eds., Konstanz, Germany, August 2000, pp. 176–190.
[20] P. Rattadilok, A. Gaw, and R. Kwan, “Distributed choice function hyper-heuristics for timetabling and scheduling,” in Practice and Theory of Automated Timetabling V, Springer Lecture Notes in Computer Science, E. Burke and M. Trick, Eds., vol. 3616, 2005, pp. 51–67.
[21] A. S. Fukunaga, “Automated discovery of composite SAT variable-selection heuristics,” in Eighteenth National Conference on Artificial Intelligence. Menlo Park, CA, USA: American Association for Artificial Intelligence, 2002, pp. 641–648.
[22] A. Fukunaga, “Evolving local search heuristics for SAT using genetic programming,” in Proceedings of the Genetic and Evolutionary Computation Conference 2004 (GECCO ’04), K. Deb, R. Poli, W. Banzhaf, H.-G. Beyer, E. Burke, P. Darwen, D. Dasgupta, D. Floreano, J. Foster, M. Harman, O. Holland, P. L. Lanzi, L. Spector, A. Tettamanzi, D. Thierens, and A. Tyrrell, Eds., vol. 3103. Seattle, WA, USA: Springer-Verlag, 2004, pp. 483–494.
[23] A. S. Fukunaga, “Automated discovery of local search heuristics for satisfiability testing,” Evolutionary Computation (MIT Press), vol. 16, no. 1, pp. 31–1, 2008.
[24] E. K. Burke, M. R. Hyde, and G. Kendall, “Evolving bin packing heuristics with genetic programming,” in LNCS 4193, Proceedings of the 9th International Conference on Parallel Problem Solving from Nature (PPSN 2006), T. Runarsson, H.-G. Beyer, E. Burke, J. J. Merelo-Guervos, D. Whitley, and X. Yao, Eds., Reykjavik, Iceland, September 2006, pp. 860–869.
[25] P. Gilmore and R. Gomory, “A linear programming approach to the cutting-stock problem,” Operations Research, vol. 9, no. 6, pp. 849–859, 1961.
[26] N. Christofides and C. Whitlock, “An algorithm for two-dimensional cutting problems,” Operations Research, vol. 25, no. 1, pp. 30–44, 1977.
[27] J. E. Beasley, “An exact two-dimensional non-guillotine cutting tree search procedure,” Operations Research, vol. 33, no. 1, pp. 49–64, 1985.
[28] M. Hifi and V. Zissimopoulos, “A recursive exact algorithm for weighted two-dimensional cutting,” European Journal of Operational Research, vol. 91, no. 3, pp. 553–564, 1996.
[29] V. D. Cung, M. Hifi, and B. Le Cun, “Constrained two-dimensional cutting stock problems: a best-first branch-and-bound algorithm,” International Transactions in Operational Research, vol. 7, no. 3, pp. 185–210, 2000.
[30] B. S. Baker, E. G. Coffman Jr., and R. L. Rivest, “Orthogonal packings in two dimensions,” SIAM Journal on Computing, vol. 9, no. 4, pp. 846–855, 1980.
[31] B. Chazelle, “The bottom-left bin packing heuristic: An efficient implementation,” IEEE Transactions on Computers, vol. 32, no. 8, pp. 697–707, 1983.
[32] D. J. Brown, B. S. Baker, and H. P. Katseff, “Lower bounds for online two-dimensional packing algorithms,” Acta Informatica, vol. 18, no. 2, 1982.
[33] E. Burke, G. Kendall, and G. Whitwell, “A new placement heuristic for the orthogonal stock-cutting problem,” Operations Research, vol. 55, no. 4, pp. 655–671, 2004.
[34] D. Zhang, Y. Kang, and A. Deng, “A new heuristic recursive algorithm for the strip rectangular packing problem,” Computers and Operations Research, vol. 33, no. 8, pp. 2209–2217, 2006.
[35] F. Rinaldi and A. Franz, “A two-dimensional strip cutting problem with sequencing constraint,” European Journal of Operational Research, in press, corrected proof, available online 7 July 2006.
[36] D. Liu and H. Teng, “An improved BL-algorithm for genetic algorithms of the orthogonal packing of rectangles,” European Journal of Operational Research, vol. 112, no. 2, pp. 413–419, 1999.
[37] G. Belov, G. Scheithauer, and E. A. Mukhacheva, “One-dimensional heuristics adapted for two-dimensional rectangular strip packing,” Journal of the Operational Research Society, advance online publication, 4 April 2007.
[38] A. Ramesh Babu and N. Ramesh Babu, “Effective nesting of rectangular parts in multiple rectangular sheets using genetic and heuristic algorithms,” International Journal of Production Research, vol. 37, no. 7, pp. 1625–1643, 1999.
[39] E. Burke, G. Kendall, and G. Whitwell, “A simulated annealing enhancement of the best-fit heuristic for the orthogonal stock cutting problem,” INFORMS Journal on Computing, accepted, to appear 2009.
[40] C. L. Valenzuela and P. Y. Wang, “Heuristics for large strip packing problems with guillotine patterns: An empirical study,” in Proceedings of the Metaheuristics International Conference 2001 (MIC’01), University of Porto, Porto, Portugal, 2001, pp. 417–421.
[41] K. K. Lai and J. W. M. Chan, “Developing a simulated annealing algorithm for the cutting stock problem,” Computers and Industrial Engineering, vol. 32, no. 1, pp. 115–127, 1997.
[42] L. Faina, “An application of simulated annealing to the cutting stock problem,” European Journal of Operational Research, vol. 114, no. 3, pp. 542–556, 1999.
[43] A. Bortfeldt, “A genetic algorithm for the two-dimensional strip packing problem with rectangular pieces,” European Journal of Operational Research, vol. 172, no. 3, pp. 814–837, 2006.
[44] R. Alvarez-Valdes, F. Parreno, and J. M. Tamarit, “Reactive GRASP for the strip-packing problem,” Computers and Operations Research, vol. 35, no. 4, pp. 1065–1083, 2008.
[45] E. K. Burke, M. R. Hyde, G. Kendall, and J. Woodward, “Automatic heuristic generation with genetic programming: Evolving a jack-of-all-trades or a master of one,” in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2007), London, UK, July 2007, pp. 1559–1565.
[37] G. Belov, G. Scheithauer, and E. A. Mukhacheva, “One-dimensional heuristics adapted for two-dimensional rectangular strip packing,” Journal of the Operational Research Society, advance online publication 4 April 2007. [38] A. Ramesh Babu and N. Ramesh Babu, “Effective nesting of rectangular parts in multiple rectangular sheets using genetic and heuristic algorithms,” International Journal of Production Research, vol. 37, no. 7, pp. 1625–1643, 1999. [39] E. Burke, G. Kendall, and G. Whitwell, “A simulated annealing enhancement of the best-fit heuristic for the orthogonal stock cutting problem,” INFORMS Journal On Computing (accepted), to appear 2009. [40] C. L. Valenzuela and P. Y. Wang, “Heuristics for large strip packing problems with guillotine patterns: An empirical study,” in Proceedings of the Metaheuristics International Conference 2001 (MIC’01), University of Porto, Porto, Portugal, 2001, pp. 417–421. [41] K. K. Lai and J. W. M. Chan, “Developing a simulated annealing algorithm for the cutting stock problem,” Computers and Industrial Engineering, vol. 32, no. 1, pp. 115–127, 1997. [42] L. Faina, “An application of simulated annealing to the cutting stock problem,” European Journal of Operational Research, vol. 114, no. 3, pp. 542–556, 1999. [43] A. Bortfeldt, “A genetic algorithm for the two-dimensional strip packing problem with rectangular pieces,” European Journal of Operational Research, vol. 172, no. 3, pp. 814–837, 2006. [44] R. Alvarez-Valdes, F. Parreno, and J. M. Tamarit, “Reactive grasp for the strip-packing problem,” Computers and Operations Research, vol. 35, no. 4, pp. 1065–1083, 2008. [45] E. K. Burke, M. R. Hyde, G. Kendall, and J. Woodward, “Automatic heuristic generation with genetic programming: Evolving a jack-ofall-trades or a master of one,” in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2007), London, UK., July 2007, pp. 1559–1565.