Mixed-Integer Evolution Strategies and Their Application to Intravascular Ultrasound Image Analysis

Rui Li¹, Michael T.M. Emmerich¹, Ernst G.P. Bovenkamp², Jeroen Eggermont², Thomas Bäck¹, Jouke Dijkstra², and Johan H.C. Reiber²

¹ Natural Computing Group, Leiden University, P.O. Box 9500, Leiden 2300 CA, The Netherlands
{ruili, emmerich, baeck}@liacs.nl
² Division of Image Processing, Department of Radiology C2S, Leiden University Medical Center, P.O. Box 9600, Leiden 2300 RC, The Netherlands
{E.G.P.Bovenkamp, J.Eggermont, J.Dijkstra, J.H.C.Reiber}@lumc.nl

Abstract. This paper discusses Mixed-Integer Evolution Strategies and their application to an automatic image analysis system for IntraVascular UltraSound (IVUS) images. Mixed-Integer Evolution Strategies can optimize different types of decision variables, including continuous, nominal discrete, and ordinal discrete values. The algorithm is first applied to a set of test problems with scalable ruggedness and dimensionality. The algorithm is then applied to the optimization of an IVUS image analysis system. The performance of this system depends on a large number of parameters that – so far – need to be chosen manually by a human expert. It will be shown that a mixed-integer evolution strategy algorithm can significantly improve these parameters compared to the manual settings by the human expert.

1 Introduction

Many optimization problems are difficult to solve in practice with standard numerical methods (such as gradient-based strategies), because they incorporate different types of discrete parameters and confront the algorithms with a complex geometry (rugged surfaces, discontinuities). Moreover, the high dimensionality of these problems makes it almost impossible to find optimal settings through manual experimentation. Therefore, robust automatic optimization strategies are needed to tackle such problems. In this paper we present an algorithm, the Mixed-Integer Evolution Strategy (MI-ES), that can handle difficult mixed-integer parameter optimization problems. In particular, we aim to optimize feature detectors in the field of Intravascular Ultrasound (IVUS) image analysis. IVUS images show the inside of coronary or other arteries and are acquired with an ultrasound catheter positioned inside the vessel. An example of an IVUS image with several detected features can be seen in Figure 1. IVUS images are difficult to interpret, which makes manual segmentation highly sensitive to intra- and inter-observer variability [5]. In addition, manual segmentation of the large number of IVUS images per patient is very time consuming. Therefore an automatic system is needed. However, feature detectors consist of large numbers of parameters that are hard to optimize manually and may differ between interpretation contexts. Moreover, these parameters are subject to change when something changes in the image acquisition process.

Fig. 1. An IntraVascular UltraSound (IVUS) image with detected features. The black circle in the middle is where the ultrasound imaging device (catheter) was located. The dark area surrounding the catheter is called the lumen, which is the part of the artery where the blood flows. Above the catheter calcified plaque is detected, which blocks the ultrasound signal and causes a dark shadow. Between the inside border of the vessel and the lumen there is some soft plaque, which does not block the ultrasound signal. The dark area left of the catheter is a sidebranch.

To optimize these IVUS feature detectors, we propose to use mixed-integer evolution strategies, which have already been applied successfully to simulator-based optimization of chemical engineering plants [2]. Before applying these strategies to the application problem, their performance and reliability are first evaluated on a set of test functions, including a new integer test function with a scalable degree of ruggedness. This paper is structured as follows: first, in Section 2, we explain the application problem in more detail. Then, in Section 3, mixed-integer evolution strategies (MI-ES) are introduced. These strategies are tested on artificial test problems in Section 4. Promising variants of the MI-ES are then applied to the parameter optimization of an IVUS feature detector in Section 5. Finally, conclusions and future work are presented in Section 6.

2 Intravascular Ultrasound Image Analysis

IntraVascular UltraSound (IVUS) is a technique used to obtain real-time high-resolution tomographic images from the inside of coronary vessels and other arteries. To gain insight into the status of an arterial segment, a so-called catheter pullback sequence is carried out. A catheter (±1 mm) with a miniaturized ultrasound transducer at the tip is inserted into a patient's artery and positioned downstream of the segment of interest. The catheter is then pulled back in a controlled manner, using motorized pullback (1 mm/s), during which images are acquired continuously. In [1] a state-of-the-art multi-agent system is used to detect lumen, vessel, shadows, sidebranches and calcified plaques in IVUS images. The system, shown in Figure 2, is based on the cognitive architecture Soar (States, operators and results) [6]. IVUS image processing agents interact with other agents through communication, act on the world by controlling and adapting image processing operations, and perceive that same world by accessing image processing results. Agents thereby dynamically adapt the parameters of low-level image segmentation algorithms based on knowledge of global constraints, contextual knowledge, local image information and personal beliefs. The lumen agent, for example, encodes and controls an image processing pipeline which includes binary morphological operations, an ellipse fitter and a dynamic programming module, and it determines all relevant parameters. In general, agent control allows the underlying segmentation algorithms to be simpler and to be applied to a wider range of problems with higher reliability.


Fig. 2. Global view of the multi-agent system architecture

Fig. 3. Schematic version of Figure 1 as detected by the multi-agent image segmentation system

Although the multi-agent system has been shown to offer lumen and vessel detection comparable to human experts [1], it is designed for symbolic reasoning, not numerical optimization. Furthermore, it is almost impossible for a human expert to completely specify how an agent should adjust its feature detection parameters in each and every possible interpretation context. As a result, an agent has control knowledge only for a limited number of contexts and a limited set of feature detector parameters. In addition, this knowledge has to be updated whenever something changes in the image acquisition pipeline. It would therefore be much better if such knowledge could be acquired by learning the optimal parameters for different interpretation contexts automatically.

3 Mixed-Integer Evolution Strategies

From the point of view of numerical analysis, the problem described above can be classified as a black-box parameter optimization problem. The evaluation software is represented as an objective function f : S → R to be minimized, where S defines a parametric search space from which the decision variables can be drawn. Typically, constraint functions are also defined, the values of which have to be kept within a feasible domain. The objective function evaluation, and partly also the constraint function evaluation, is carried out by the evaluation software, which, from the point of view of the algorithm, serves as a black-box evaluator. One of the main reasons why standard approaches for black-box optimization cannot be applied to the application problem is that different types of discrete and continuous decision variables are involved in the optimization. For the parametrization of the image analysis system three main classes of decision variables are identified:

– Continuous variables: variables that can change gradually in arbitrarily small steps.
– Ordinal discrete variables: variables that can be changed gradually but only in smallest steps (e.g. discretized levels, integer quantities).
– Nominal discrete variables: discrete parameters with no reasonable ordering (e.g. discrete choices from an unordered list/set of alternatives, binary decisions).

Taking the above into account, the optimization task reads:

f(r_1, ..., r_{n_r}, z_1, ..., z_{n_z}, d_1, ..., d_{n_d}) → min    (1)

subject to:

r_i ∈ [r_i^min, r_i^max] ⊂ R,  i = 1, ..., n_r
z_i ∈ [z_i^min, z_i^max] ⊂ Z,  i = 1, ..., n_z
d_i ∈ D_i = {d_{i,1}, ..., d_{i,|D_i|}},  i = 1, ..., n_d

Here r_i denotes a continuous variable, z_i an integer variable, and d_i a nominal discrete variable which can be drawn from a prescribed set D_i.
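To make the setup concrete, the sketch below shows one way such a mixed-integer search space and black-box objective could be represented in Python. The variable names, bounds and the toy objective are illustrative assumptions, not the actual parametrization of the IVUS system.

```python
import random

# Hypothetical mixed-integer search space: bounds for continuous and
# integer variables, finite choice sets for nominal discrete variables.
SPACE = {
    "continuous": [(-10.0, 10.0), (0.5, 10.0)],       # [r_min, r_max] per variable
    "integer":    [(0, 100), (-100, 100)],             # [z_min, z_max] per variable
    "nominal":    [["ellipse", "circle"], [0, 1, 2]],  # unordered sets D_i
}

def random_solution(space):
    """Sample a feasible solution uniformly from the mixed-integer space."""
    r = [random.uniform(lo, hi) for lo, hi in space["continuous"]]
    z = [random.randint(lo, hi) for lo, hi in space["integer"]]
    d = [random.choice(choices) for choices in space["nominal"]]
    return r, z, d

def objective(solution):
    """Toy black-box objective f(r, z, d) -> min (placeholder for the real evaluator)."""
    r, z, d = solution
    penalty = 0.0 if d[0] == "ellipse" else 10.0  # nominal choice enters only via a penalty
    return sum(x * x for x in r) + sum(x * x for x in z) + penalty
```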

3.1 Evolution Strategies

Mixed-integer evolution strategies (MI-ES) are a special instantiation of evolution strategies that can deal with mixed parameter types, i.e. they can tackle problems as described above. They were first proposed by Emmerich et al. [2, 3] for the purpose of chemical engineering plant optimization with process simulators from industry. There, a detailed outline of the algorithm was given, with experiments on test functions of higher dimension. However, the experimental analysis of this algorithm deserves some further attention. Therefore, before presenting results of new studies below, we first give a brief outline of the algorithm and discuss its working principles.

The main loop of the MI-ES is displayed in Algorithm 1. First, the algorithm initializes µ individuals randomly (uniformly distributed within the parameter ranges) and evaluates them by means of the objective function f. Then, λ offspring individuals are created. For each new variant, two individuals are drawn out of the current population P(t), and discrete recombination (= uniform crossover) on the object variables and intermediate recombination (averaging) on the step-size parameters are used to generate an offspring. Each of the λ offspring generated by this process is then modified by means of the mutation operator, which is described in more detail in Section 3.2. Next, the objective function values of the λ new (offspring) individuals are evaluated. The selection operator then chooses the µ best individuals out of the λ offspring individuals and the µ parental individuals. Note that an alternative strategy would be to consider only the offspring population for the selection of individuals in P(t); this strategy would be termed a (µ, λ)-ES. In this paper we choose a plus-strategy, based on preliminary experiments that indicated that it performs better whenever only a small number of experiments can be afforded. As long as the termination criterion¹ is not fulfilled, these µ selected individuals form the next population P(t + 1). The proposed algorithmic scheme is called a (µ + λ)-ES.

Algorithm 1. Main loop of a (µ + λ)-ES
  t ← 0
  initialize population P(t) of µ individuals randomly within the individual space I
  evaluate the µ initial individuals using the fitness function f
  while termination criterion not fulfilled do
    recombine λ offspring individuals out of the µ parents, by randomly choosing two individuals and recombining them to obtain each offspring individual
    mutate the λ offspring individuals
    evaluate the λ offspring individuals
    select the µ best individuals for P(t + 1) from the λ offspring individuals and the µ parents
    t ← t + 1
  end while

¹ In most cases a maximal number of generations is taken as the termination criterion.
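As an illustration of this main loop, here is a minimal Python sketch of a (µ + λ) strategy. It is not the authors' implementation: the individual encoding is left abstract, and `init_individual`, `recombine` and `mutate` are assumed helpers (e.g. the mutation procedure of Section 3.2).

```python
import random

def mi_es(f, init_individual, recombine, mutate, mu=4, lam=28, generations=100):
    """Sketch of a (mu + lambda) mixed-integer evolution strategy.

    f               -- objective function to minimize (black box)
    init_individual -- callable returning a random feasible individual
    recombine       -- callable taking two parents, returning one child
    mutate          -- callable returning a mutated copy of an individual
    """
    population = [init_individual() for _ in range(mu)]
    fitness = [f(ind) for ind in population]

    for _ in range(generations):                   # termination: generation budget
        offspring = []
        for _ in range(lam):
            p1, p2 = random.sample(population, 2)  # draw two parents
            offspring.append(mutate(recombine(p1, p2)))
        off_fitness = [f(ind) for ind in offspring]

        # Plus-selection: the best mu out of parents and offspring survive.
        pool = list(zip(fitness + off_fitness, population + offspring))
        pool.sort(key=lambda t: t[0])
        fitness = [t[0] for t in pool[:mu]]
        population = [t[1] for t in pool[:mu]]

    return population[0], fitness[0]
```

With µ = 4, λ = 28 and 100 generations this corresponds to 4 + 28·100 = 2804 evaluations, matching the budget used in Section 5.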


In order to allow for mutative step-size adaptation, a surplus of offspring individuals needs to be generated in each generation. The recommended ratio of µ/λ ≈ 1/7 leads to a good average performance of evolution strategies in many cases [8].

3.2 Mutation of Different Variable Types

Individuals consist of three vectors of decision variables; associated with each of these vectors, we also maintain a vector of step-size parameters. In [2] it was proposed to learn only a single step-size per variable vector; in that case the step-size vector reduces to a scalar value associated with a class of parameters. In summary, the variable vector that represents an individual reads

(r_1, ..., r_{n_r}, z_1, ..., z_{n_z}, d_1, ..., d_{n_d}, s_1, ..., s_{n_s}, q_1, ..., q_{n_q}, p_1, ..., p_{n_p}) ∈ I,
I = R^{n_r} × Z^{n_z} × D_1 × ... × D_{n_d} × (R^+)^{n_s} × (R^+)^{n_q} × [0, 1]^{n_p}.

In case n_s = n_r the step-sizes are assigned to the different vector positions and can be adapted individually. The step-size parameters s_i for the continuous variables represent standard deviations used for scaling the mutation distribution of the object variables. In case n_s = 1 the same step-size is used for all parameters. The step-sizes for the integer variables are defined in a similar manner; as discussed below, another distribution is used to sample integer values. Finally, the step-size parameters for the nominal discrete variables are interpreted as mutation probabilities.

Algorithm 2. Mutation procedure in MI-ES
  for i = 1, ..., n_r do
    s_i ← s_i · exp(τ_g N_g + τ_l N(0, 1))
    r_i ← r_i + N(0, s_i)
  end for
  for i = 1, ..., n_z do
    q_i ← q_i · exp(τ_g N_g + τ_l N(0, 1))
    z_i ← z_i + G(0, q_i)
  end for
  for i ∈ {1, ..., n_d} do
    p_i ← 1 / [1 + ((1 − p_i)/p_i) · exp(−τ_l · N(0, 1))]
    if U(0, 1) < p_i then
      d_i ← uniformly random value from D_i
    end if
  end for

Algorithm 2 summarizes the mutation procedure. For the local and global step-size learning rates τ_l and τ_g we use the recommended parameter settings with proportionality constant τ = 1, i.e. τ_l = τ/√(2√n_r) and τ_g = τ/√(2 n_r) (cf. [8]).
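The following Python sketch mirrors Algorithm 2 for the single-step-size-per-class case (n_s = n_q = n_p = 1). The individual layout (a dict with keys 'r', 'z', 'd', 's', 'q', 'p') and the helpers `sample_geometric`, `reflect` and `clip` are assumptions for illustration; the helpers are sketched after Eq. (2) below, and the exact learning-rate handling of the original algorithm may differ in detail.

```python
import math
import random

def mutate(ind, space, tau_g, tau_l):
    """Sketch of the MI-ES mutation (Algorithm 2) with one step size per variable class."""
    ng = random.gauss(0.0, 1.0)  # global factor N_g, drawn once per individual

    # Continuous variables: log-normal step-size adaptation, Gaussian perturbation.
    ind["s"] *= math.exp(tau_g * ng + tau_l * random.gauss(0.0, 1.0))
    ind["r"] = [reflect(x + random.gauss(0.0, ind["s"]), lo, hi)
                for x, (lo, hi) in zip(ind["r"], space["continuous"])]

    # Integer variables: log-normal step-size adaptation, geometric perturbation G(0, q).
    ind["q"] *= math.exp(tau_g * ng + tau_l * random.gauss(0.0, 1.0))
    ind["z"] = [clip(x + sample_geometric(ind["q"], len(ind["z"])), lo, hi)
                for x, (lo, hi) in zip(ind["z"], space["integer"])]

    # Nominal variables: logistic update of the mutation probability p,
    # then uniform resampling of each d_i with probability p.
    ind["p"] = 1.0 / (1.0 + (1.0 - ind["p"]) / ind["p"]
                      * math.exp(-tau_l * random.gauss(0.0, 1.0)))
    ind["d"] = [random.choice(choices) if random.random() < ind["p"] else x
                for x, choices in zip(ind["d"], space["nominal"])]
    return ind
```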


Note that in case a single step-size is chosen, τ_l = 0. In the algorithm, N(0, 1) denotes a function that returns a standard normally distributed random number. Accordingly, U(0, 1) denotes a function returning a uniformly distributed number in [0, 1] ⊂ R, and G(0, q) returns a geometrically distributed random value. Among these distributions, the geometric distribution deserves further attention, as it is rarely referred to in the literature. Rudolph [7] proposed the use of the geometric distribution for integer representations. Geometrically distributed random variables take values in Z and have properties similar to normally distributed random variables in R. In particular, they have infinite support in Z, are unimodal with a peak at 0, and their probability function is symmetric about the origin. Moreover, as pointed out by Rudolph [7], multivariate extensions are characterized by a rotational symmetry with regard to the ℓ1 norm² and they belong to a family of maximal entropy distributions. Finally, by increasing the value of q the standard deviation of the random variable can be gradually increased. All these characteristics make the geometric distribution well suited for application within mixed-integer evolution strategies. A geometrically distributed random variable G can be generated from two uniformly distributed random variables u_1 := U(0, 1) and u_2 := U(0, 1) via:

G = G_1 − G_2,   p = 1 − (s/n_z) / (1 + √(1 + (s/n_z)²)),   G_i = ⌊ ln(1 − u_i) / ln(1 − p) ⌋,   i = 1, 2    (2)

The mutation of the mutation probabilities is done by means of a logistic distribution, as described in [2]. To make sure that variables stay within their respective boundaries, we have added some routines for interval treatment to the MI-ES. For the continuous variables we use reflection at the boundary, whereas for the integer variables we set the value to the bound whenever the bound is exceeded. The latter method is also used to keep the mutation probabilities within bounds.
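A sketch of the sampling routine of Eq. (2) and of the interval treatment described above is given below; `reflect` and `clip` are the helpers referred to in the mutation sketch above, and the exact rounding conventions are assumptions.

```python
import math
import random

def sample_geometric(s, n_z):
    """Difference of two geometric variables, G = G1 - G2, following Eq. (2).

    s is the step-size parameter and n_z the number of integer variables, so that
    s / n_z acts as the per-variable mutation strength. Assumes s > 0 so that 0 < p < 1.
    """
    m = s / n_z
    p = 1.0 - m / (1.0 + math.sqrt(1.0 + m * m))
    u1, u2 = random.random(), random.random()
    g1 = math.floor(math.log(1.0 - u1) / math.log(1.0 - p))
    g2 = math.floor(math.log(1.0 - u2) / math.log(1.0 - p))
    return g1 - g2

def reflect(x, lo, hi):
    """Reflection at the interval boundaries, used for the continuous variables."""
    while x < lo or x > hi:
        if x < lo:
            x = lo + (lo - x)
        if x > hi:
            x = hi - (x - hi)
    return x

def clip(x, lo, hi):
    """Set the value to the violated bound, used for the integer variables."""
    return max(lo, min(hi, x))
```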

4 Study on Artificial Test Problems

In order to select a favorable variant of the MI-ES for the time-consuming runs on the image analysis problem, we study the behavior of the MI-ES on a generalized sphere function and on barrier problems produced by a new problem generator. The generalized sphere function is an extension of a standard problem [2]:

f_sphere(r, z, d) = Σ_{i=1}^{n_r} r_i² + Σ_{i=1}^{n_z} z_i² + Σ_{i=1}^{n_d} d_i² → min    (3)

n_r = n_z = n_d = 7,  r ∈ [−10, 10]^{n_r} ⊂ R^{n_r},  z ∈ [−10, 10]^{n_z},  d ∈ {0, ..., 5}^{n_d}    (4)

This problem is relatively simple, as it is decomposable and unimodal. We use it to gain some insight into how the MI-ES behaves on rather simple problems and thus to estimate the best-case behavior of the MI-ES.

² Sum of absolute values.
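Written out in Python, the generalized sphere function is simply (a small sketch; the nominal values d_i are treated as the integers 0, ..., 5 from Eq. (4)):

```python
def f_sphere(r, z, d):
    """Generalized sphere function of Eq. (3): sum of squares over all variable classes."""
    return sum(x * x for x in r) + sum(x * x for x in z) + sum(x * x for x in d)
```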


Fig. 4. Surface plots of the barrier function for two variables. All other variables were kept constant at a value of zero; two integer values were varied in the range from 0 to 20.

As a more complex test case we designed the multimodal barrier problem. It produces integer optimization problems with a scalable degree of ruggedness (determined by the parameter C) by generating an integer array A using the following algorithm:

  A[i] = i, i = 0, ..., 20
  for k ∈ {1, ..., C} do
    j ← random number out of {0, ..., 19}
    swap values of A[j] and A[j + 1]
  end for

Then a barrier function is computed:

f_barrier(r, z, d) = f_sphere(r, d) + Σ_{i=1}^{n_z} A[z_i]² → min    (5)

n_r = n_z = n_d = 7,  r ∈ [−10, 10]^{n_r} ⊂ R^{n_r},  z ∈ [0, 20]^{n_z},  d ∈ {0, ..., 5}^{n_d}    (6)
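A sketch of the barrier construction follows: the array generator with C adjacent swaps and the barrier function of Eq. (5), which applies the sphere part to r and d only and maps the integer values through A.

```python
import random

def make_barrier_array(C, size=21):
    """Build the array A[0..20] and apply C random adjacent swaps (ruggedness control)."""
    A = list(range(size))                # A[i] = i, i = 0, ..., 20
    for _ in range(C):
        j = random.randint(0, size - 2)  # j drawn from {0, ..., 19}
        A[j], A[j + 1] = A[j + 1], A[j]
    return A

def f_barrier(r, z, d, A):
    """Barrier function of Eq. (5): sphere term over r and d plus the barrier term over z."""
    sphere_part = sum(x * x for x in r) + sum(x * x for x in d)
    return sphere_part + sum(A[zi] ** 2 for zi in z)
```

With a small C the landscape stays close to the sphere, while a large C introduces many barriers, as illustrated in Figure 4.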

The parameter C controls the ruggedness of the resulting function with regard to the integer space: high values of C result in rugged landscapes with many barriers. To give an intuition about the influence of C on the geometry of the function, Figure 4 shows plots of a two-variable instantiation of the barrier function for C = 20 and C = 100. In Figure 5 we present some parameter studies of the MI-ES. We compare different settings for the population and offspring sizes, {(µ = 3, λ = 10), (µ = 4, λ = 28), (µ = 15, λ = 100)}, on the sphere and barrier problems. It turns out that the (µ = 4, λ = 28) setting performs best on the more difficult problems, at least if only a limited number of about 2000 function evaluations can be afforded (as in our application problem). However, the plots on the barrier problem show that for long runs with t ≫ 2000 a strategy with a larger population size might be favorable. We also observed that it is less risky, with regard to the performance of the strategy, to use a single step-size per parameter type (n_s = n_q = n_p = 1) instead of individual step-sizes for all of the variables. This corresponds to the findings in [2].


Fig. 5. Averaged results on the sphere and the barrier functions. For all results the median and the quartiles of 20 runs are displayed.

5 Experimental Results on Image Analysis Problem

To determine whether MI-ES can be used as an optimizer for the parameters of the feature detectors in the multi-agent image analysis system, we test MI-ES on the lumen feature detector. This detector is chosen because it can produce good results in isolation, without additional information about sidebranches, shadows, plaques and vessels. Table 1 contains the parameters for the lumen feature detector together with their type, range, dependencies and the default settings determined by an expert. As can be seen, the parameters are a mix of continuous, ordinal discrete (integer) and nominal discrete (including boolean) variables.

For the experiments we use five disjoint sets of 40 images. The fitness function used in the experiments is based on the difference (see Eq. 7) between the contour c found by the lumen feature detector and the desired lumen contour C drawn by a human expert. The difference measure is defined as the sum of the distances of the points of contour c that are more than a threshold distance away from contour C. The reason to allow for a small difference between the two contours is that even an expert will not draw the exact same contour twice in a single IVUS image. The fitness function itself is the average difference calculated over the 40 images in the dataset. Let #points denote the total number of points of contour c; then the contour difference is defined as:

difference(c, C) = Σ_{p=1}^{#points} { d(c_p, C) if d(c_p, C) > threshold, 0 otherwise }    (7)
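A direct transcription of this measure might look as follows; it is a sketch only, and it assumes that contours are lists of 2-D points and that d(c_p, C) is the minimum Euclidean distance from a contour point to the expert contour (the paper does not spell out the exact distance computation).

```python
import math

def contour_difference(c, C, threshold):
    """Sum of distances of points of contour c lying more than `threshold`
    away from the expert contour C (Eq. (7)); both contours are point lists."""
    def point_to_contour(p, contour):
        return min(math.dist(p, q) for q in contour)

    total = 0.0
    for p in c:
        dist = point_to_contour(p, C)
        if dist > threshold:
            total += dist
    return total

def fitness(contour_pairs, threshold):
    """Average contour difference over a dataset of (detected, expert) contour pairs."""
    return sum(contour_difference(c, C, threshold) for c, C in contour_pairs) / len(contour_pairs)
```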

On each of the five datasets we trained our (4 + 28)-MI-ES algorithm for 100 iterations, resulting in 2804 fitness evaluations. The results are displayed in Table 2. Parameter solution 1 was trained on dataset 1, parameter solution 2 on dataset 2, and so on. Table 2 shows that in most cases the MI-ES parameter solutions result in lower average contour differences when applied to both test and training data.

Table 1. Parameters for the lumen feature detector

name              type        range              dependencies                   default
maxgray           integer     [2, 150]           > mingray                      35
mingray           integer     [1, 149]           < maxgray                      1
connectivity      ordinal     {4, 6, 8}                                         6
relativeopenings  boolean     {false, true}                                     true
nrofcloses        integer     [0, 100]           used if not relativeopenings   5
nrofopenings      integer     [0, 100]           used if not relativeopenings   45
scanlinedir       ordinal     {0, 1, 2}                                         1
scanindexleft     integer     [-100, 100]        < scanindexright               -55
scanindexright    integer     [-100, 100]        > scanindexleft                7
centermethod      ordinal     {0, 1}                                            1
fitmodel          ordinal     {ellipse, circel}                                 ellipse
sigma             continuous  [0.5, 10.0]                                       0.8
scantype          ordinal     {0, 1, 2}                                         0
sidestep          integer     [0, 20]                                           3
sidecost          continuous  [0.0, 100]                                        5
nroflines         integer     [32, 256]                                         128

Table 2. Performance of the best found MI-ES parameter solutions when trained on one of the five datasets (parameter solution i was trained on dataset i). All parameter solutions and the (default) expert parameters are applied to all datasets. Average difference (fitness) and standard deviation w.r.t. expert drawn contours are given. default parameters dataset fitness s.d. 1 395.2 86.2 2 400.2 109.2 3 344.8 65.6 4 483.1 110.6 5 444.2 90.6

parameter solution 1 fitness s.d. 148.4 39.5 183.3 59.2 205.9 69.8 284.4 92.7 368.4 100.9

parameter solution 2 fitness s.d. 159.8 43.5 180.7 58.4 203.9 70.1 269.0 73.2 370.9 102.5

parameter solution 3 fitness s.d. 185.4 43.0 207.2 69.2 164.4 49.7 250.4 80.4 462.2 377.3

parameter solution 4 fitness s.d. 144.8 42.0 232.7 71.0 183.9 80.3 173.2 64.7 168.7 64.0

parameter solution 5 fitness s.d. 271.0 74.8 352.0 73.1 327.1 55.9 330.1 82.2 171.8 54.5

Only parameter solution 3 applied to dataset 5 has a higher average contour difference than the default parameters (462.2 vs. 444.2). To determine whether the best results obtained by the MI-ES algorithm are also significantly better than the default parameter results, a paired two-tailed t-test was performed on the 40 difference measurements for each image dataset and each solution, using a 95% confidence level (p = 0.05). The t-test shows that all differences are significant except for the difference between parameter solution 3 applied to dataset 5 and the default solution. We therefore conclude that the MI-ES solutions are significantly better than the default parameter solution in 96% of the cases (24 out of 25) and equal in one case.

Another interesting result shown in Table 2 is that the solution trained on dataset 4 performs better on dataset 1 than the solution which was optimized on dataset 1, although this difference is not significant. Dataset 4 is interesting anyway, as it is the only dataset for which the performances of all parameter solutions are mutually significantly different, while the solution trained on this dataset has the best average performance on all other datasets. Visual inspection of the results of applying parameter solution 4 to the other datasets shows that this solution is a good approximator of the lumen contours in those datasets, but that the particular solutions trained on those datasets follow the expert contours more closely. Perhaps dataset 4 contains features of the other datasets (1, 2, 3 and 5), which may explain this behavior. However, visual inspection of the image datasets does not show any apparent differences.
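The significance test described above corresponds to a standard paired two-tailed t-test on the 40 per-image difference values; a sketch using SciPy (assuming those per-image values are available as arrays) could look like this:

```python
from scipy import stats

def significantly_better(default_diffs, solution_diffs, alpha=0.05):
    """Paired two-tailed t-test on the 40 per-image contour differences of one dataset."""
    t_stat, p_value = stats.ttest_rel(default_diffs, solution_diffs)
    return p_value < alpha, t_stat, p_value
```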

6 Conclusions and Outlook

In this paper we have applied Mixed-Integer Evolution Strategies (MI-ES) to optimize the parameters of a lumen feature detector for IntraVascular UltraSound (IVUS) images. The mixed-integer evolution strategy uses state-of-the-art variation operators for mixed-integer representations, where the parameters are a combination of continuous, ordinal discrete and nominal discrete variables. Different instantiations of the MI-ES were tested on two artificial test problems in order to identify a favorable parameter setting for the experiments on the application problem. Based on the results of this study, a (4 + 28)-MI-ES with a single step-size for each variable class was identified as a robust strategy and used for tackling the image analysis problem. The results show a significant improvement over the parameters tuned manually by an expert.

Our first results clearly indicate the feasibility of the MI-ES approach for optimizing the parameters of feature detectors. However, our tests are at the moment restricted to the lumen feature detector. In future studies we plan to investigate the potential of the method for the other feature detectors needed for IVUS image analysis. We do not expect to find one optimal solution for each feature detector that works in all possible contexts and for all possible patients. Therefore we are going to apply the methods outlined in this paper to different image interpretation contexts, which should result in a set of optimal image feature detector solutions rather than in a single solution. The aim is then to let an agent in the multi-agent system decide which particular solution to use based on its current knowledge of the situation. We also intend to further study the mixed-integer evolution strategy algorithm, both in theory and in practice, to get more insight into its strengths and weaknesses.

Acknowledgements

This research is supported by the Netherlands Organisation for Scientific Research (NWO) and the Technology Foundation STW. We would like to thank Andreas Fischbach and Anne Kunert for the experiments on the sphere and barrier problems, and the anonymous reviewers for their useful comments.


References

1. E.G.P. Bovenkamp, J. Dijkstra, J.G. Bosch, and J.H.C. Reiber. Multi-agent segmentation of IVUS images. Pattern Recognition, 37(4):647–663, April 2004.
2. M. Emmerich, M. Schütz, B. Gross, and M. Grötzner. Mixed-integer evolution strategy for chemical plant optimization. In I.C. Parmee, editor, Evolutionary Design and Manufacture (ACDM 2000), pages 55–67. Springer, New York, 2000.
3. M. Emmerich, M. Grötzner, and M. Schütz. Design of graph-based evolutionary algorithms: A case study for chemical process networks. Evolutionary Computation, 9(3):329–354, 2001. MIT Press.
4. F. Hoffmeister and J. Sprave. Problem independent handling of constraints by use of metric penalty functions. In L.J. Fogel, P.J. Angeline, and Th. Bäck, editors, Evolutionary Programming V – Proc. Fifth Annual Conf. on Evolutionary Programming (EP'96), pages 289–294. The MIT Press, 1996.
5. G. Koning, J. Dijkstra, C. von Birgelen, J.C. Tuinenburg, J. Brunette, J.-C. Tardif, P.W. Oemrawsingh, C. Sieling, and S. Melsa. Advanced contour detection for three-dimensional intracoronary ultrasound: validation – in vitro and in vivo. The International Journal of Cardiovascular Imaging, 18:235–248, 2002.
6. A. Newell. Unified Theories of Cognition. Harvard University Press, Cambridge, Massachusetts, 1990. ISBN 0-674-92101-1.
7. G. Rudolph. An evolutionary algorithm for integer programming. In Y. Davidor et al., editors, Proc. PPSN III, Jerusalem, pages 139–148. Springer, Berlin, 1994.
8. H.-P. Schwefel. Evolution and Optimum Seeking. Sixth Generation Computing Series, John Wiley, New York, 1995.
