RENDERING EXPLOSIONS

Fabrice Uhl and Jacques Blanc-Talon
ETCA/CREA/SP, 16 bis, av. Prieur de la Côte d'Or, 94114 Arcueil, FRANCE
[email protected],
[email protected]
Keywords: Image synthesis, Military application, Performance analysis, Hybrid simulation, Deterministic model.

ABSTRACT

A new method for simulating and rendering explosion phenomena is presented. It is based on an improvement of the SPH (Smoothed Particle Hydrodynamics) model introduced by Monaghan. Choosing polynomial kernel functions makes it possible to approximate the attenuation term and to compute an analytical solution for the illumination integral in the ray-tracing algorithm. Dynamics uses well-known SPH evolution rules whose parameters are inspired by chemical rules and measured data. A parallel computation on a CM5 machine is performed by message passing with the PVM library and a space-subdivision approach; the same programming scheme is used both for rendering the scene and for dazzle postprocessing.

VIRTUAL EXPLOSIONS FOR TESTING REAL ALGORITHMS

Information systems are crucial in any modern military conflict. Fitted with a video input, computers are used for enhancing and processing images as well as for detecting targets. After decades of technological improvements, an important question still remains, namely the reliability of image processing algorithms when used in real conditions. One way of measuring their reliability consists in running them on a great number of images, which leads to the problem of producing realistic test images. However, while some non-linear signals can be more or less easily modelled by their statistical properties, others such as explosions, smoke or lures are still hard to model. This research aims at computing physically realistic detonic images so as to set them into video sequences. The process of encrusting a real image with a synthetic sub-image has already been addressed in another work. Such an approach will help in the performance evaluation of vision algorithms by providing parameterized images at low cost and allowing repeated testing (Blanc-Talon 1997). This paper focuses on the simulation and rendering of explosions. This particular class of non-linear phenomena has been chosen since explosions are destructive and highly disruptive, and therefore cannot be produced whenever needed. In section 1, existing methods in image synthesis are compared according to the need of mixing (solid) particles and gases. A simulation model based on "continuous particles" is proposed in section 2. In section 3, a ray-tracing algorithm specifically designed for the proposed model is given; its implementation on a parallel machine (CM5) is detailed and a significant synthetic result is shown.

1 MODELLING VS RENDERING

From a physical and optical point of view, there are three important components in an explosive phenomenon: intense and local light emission (which often saturates the picture) with ejection of solid incandescent particles, burning gas growth with global light emission, and burned gas expansion through a wind field. A model whose implementation must render a physically consistent image should take into account such dramatic changes in the geometry, the chemistry and the optics of the object to be modelled. We are mostly interested in preserving the optical realism of the explosion. Thus we have to find a good balance between a model which is computationally tractable and one which yields enough turbulence to look realistic, which is not simple, as exemplified in the following literature review. Three facets of the problem have to be considered, namely the modelling, the rendering and the dynamics.
Modelling gases and related transparent objects is not an easy task because their borders are not well defined. Existing methods make use of a voxel grid so as to spatially discretize the data set, as in a finite element simulation. However, while such methods are well adapted to real data, they have serious drawbacks such as their memory cost, the occurrence of aliasing effects and the loss of resolution. Transparent textures were used in (Gardner 1985) on ellipsoids for cloud synthesis. (Perlin and Hoffert 1989) used a noise-based function and solid texturing, while (Sakas and Kernke 1994) performed shape and texture filtering before combining them into a final object. Fractal-based methods were used in (Voss 1983) for producing Gaussian fractals; in fact, the shapes remain quite simple. Particle systems were introduced by (Reeves 1983) for many applications and especially for luminous explosion phenomena; unfortunately, no real physical laws were used to model the dynamical counterpart of the system. Smoothed particles were used by (Stam and Fiume 1994) and are also used in this paper, mostly because an explosion looks spherical and can be well approximated by a blobby structure. As explained before, explosions are rather complex phenomena and drastic assumptions have to be avoided so as to simulate them correctly; notwithstanding, the rendering method has to take into account light attenuation and multiple scattering effects. While analytical solutions can be computed for constant or simple density functions (Max 1994a; Nishita et al. 1987; Blinn 1982b), the evaluation for a general density distribution calls for approximation and discretization, which may generate aliasing (Haas and Sakas 1992). For luminous objects like particles, a simple projection is used, each particle adding its own intensity (Reeves 1983). Transparent particles are simulated by motion blur and a buffer (Sims 1990) or time exposure (Takai et al. 1995).
A simple density function or light transport assumptions make possible the use of a scanline algorithm (Max 1986). Ray-tracing methods are generally used for the single scattering approximation (Blinn 1982b; Max 1994a; Nishita et al. 1987; Stam and Fiume 1994), but new illumination models were introduced in (Kajiya and Von Herzen 1984) and (Inakage 1989) so as to handle multiple scattering effects. Radiosity methods must be associated with a voxel data structure and can deal with multiple scattering, as in the zonal method presented by (Rushmeier and Torrance 1987). (Stam and Fiume 1995) proposed a radiosity-based algorithm for calculating radiation exchanges between surfaces and blobs, the internal scattering being processed by blob approximation. Generalizations to non-isotropic phase functions (Bhate and Tokuta 1992; Patmore 1993; Max 1994b; Languenou et al. 1994) and optimization algorithms (Sillion 1994; Arques et al. 1996) have been proposed. However, their cost in memory and computing time is unsatisfactory, owing to the dimension and the discretization of the phase functions. Mixed and/or modified algorithms are based on two or more passes and generally consist of a radiosity pass for coarse diffuse light exchange, followed by a ray-tracing pass for specular reflection. Monte-Carlo-like methods combine a light propagation pass followed by a rendering pass. Although highly time-consuming, these methods can handle all physical phenomena of light transport. Solutions including participating media were proposed in (Rushmeier 1988; Blasi et al. 1994). Due to the uniform distribution of light emitters over space, such methods take time to converge and need many ray samples. The dynamical part of the phenomenon is ensured by changing the model parameters, leading to fractal effects. However, there is no solution to the fractal inverse problem of finding the optimal parameters; instead, one has to state a few heuristics which allow an approximate parameter computation. Such an approach is more suitable for static images than for animated pictures, since human perception is very sensitive to image realism over time. Some work has been done on the evolution of wind fields (Stam and Fiume 1994). Modelling the wind is performed in different ways. (Sims 1990) and (Wejchert and Haumann 1991) used and combined primitive functions, but no turbulence was generated. One way of adding turbulence is either to use particle vortices (Gamito et al. 1995) and to suppose the fluid is purely rotational, or to construct a wind field (Stam and Fiume 1993) with a smooth scale and a turbulent scale. (Sakas 1993) directly defined the density, in which case the dynamics is performed by spectral translations.
It is assumed in these methods that the particles are small enough not to interact with the wind, which is not true in the case of an explosion, where the blow is created at the same time as the particles. Another way of creating an interesting dynamical behavior is to use inter-particle forces based on the Lennard-Jones potential. For instance, (Luciani et al. 1995) proposed a method for creating particle forces with different ranges so as to yield turbulence. Cellular automata were used in (Agui et al. 1991) to simulate a flame, while (Takai et al. 1995) focused on the parallel aspect of the model. More or less approximate physical laws have been used for simulation. (Kajiya and Von Herzen 1984) proposed some simple equations for cloud formation, (Inakage 1989) chose adequate parameters and used measured data for simulating a flame, while (Perry and Picard 1994) used a more physical model for flame spreading. An advection-diffusion equation was used in (Stam and Fiume 1995) for combustion parameter changes. Up to now, no paper has been published about the first two steps of an explosion phenomenon.
2 MODELLING AND SIMULATING
The SPH model as introduced by (Monaghan 1982) is well adapted to gas modelling and widely used in various scientific domains. Basically, any function f : R^3 -> R can be replaced by the integral

    f(x) ≈ ∫_{R^3} f(y) W(|x - y|, h) dy

provided that the kernel W approaches the Dirac function as the smoothing length h tends to 0:

    ∫_{R^3} W(|x - y|, h) dx = 1

The integral is further approximated by a sum over all particles:

    f(x) ≈ Σ_i (m_i / ρ_i) f_i W(|x - x_i|, h)

with the density ρ_i = ρ(x_i) = Σ_j m_j W(|x_i - x_j|, h). (Blinn 1982a) used smoothed particles for generating isosurfaces and called them blobs. Frequently used kernels are Gaussian-like or polynomial-based functions. Our work modifies the polynomial kernel defined in (Murakami and Ichihara 1987); the generalization to higher degrees is straightforward:

    W(|x - x_i|, R_i) = C_i (1 - |x - x_i|^2 / R_i^2)^2    (1)

where W is defined within the ball B_i of radius R_i and center x_i, and C_i = 105 / (32 π R_i^3) is the normalization factor.

Two totally different cases for particle dynamics have to be considered: the gas evolution and the solid particle ejection. When an explosion occurs, solid elements are kicked out of the blob zone. There is no interaction between them and the gas, mainly because they quickly vanish. Their trajectory is governed by gravity, and they are considered as "classical" particles, as described in (Reeves 1983). For data structure consistency, classical particles are implemented as single blob elements with infinite density. Following the start of the explosion, a system of blobs begins to grow. Its evolution rules with respect to density, thermal energy, particle speed, radius and position are given by Monaghan. We slightly modified these equations by adding chemical rules and measured data, so as to account for different gas types, temperature increase, mass exchange and changing optical parameters due to combustion. Three gas types are considered, that is, 3 sets of parameters for every particle.

3 PARALLEL RENDERING

The luminous effects in the first steps of an explosion consist in a high self-emission of light, and therefore multiple scattering can be neglected. Self-emission illumination, like density, is approximated on the same blob elements by a quartic polynomial I(x). Given a blob with smoothing parameter R_i, its intersection with a ray is computed and a degree four polynomial is easily derived from (1) in the ray coordinate t. At some additional memory cost, the polynomial coefficients have to be calculated only once per ray, which saves computing time when blob elements overlap. As assumed above, multiple scattering is neglected, yielding a simplified illumination equation:

    I = ∫_0^∞ e^{-τ(0,t)} (1 - Ω) ρ(t) I(t) dt    (2)

where I is the total intensity, Ω the albedo, I(t) the local intensity or emission (for the sake of simplicity, the scene contains no other object and has no background), and ρ the density function. τ(t_0, t_1) = ∫_{t_0}^{t_1} ρ(t') dt' is the optical depth, and e^{-τ(t_0, t_1)} gives the light attenuation between ray coordinates t_0 and t_1.
Figure 1: Ray-blob intersection

The general algorithm consists in collecting all intersections with blob elements and constructing an interval list. Proceeding from front to back, on each interval [t_0, t_1] the density and local illumination polynomials are calculated as the sum of every contributing blob element. The integral is then approximated by replacing the attenuation integral with a linear function τ(t_0, t) ≈ a (t - t_0), where a (t_1 - t_0) = ∫_{t_0}^{t_1} ρ(t) dt, i.e. a is the mean density. Equation (2) becomes:

    I ≈ ∫_{t_0}^{t_1} e^{-a(t - t_0)} ρ(t) I(t) dt
      ≈ ∫_{t_0}^{t_1} e^{-a(t - t_0)} Σ_{i=0}^{8} a_i t^i dt
      = e^{a t_0} Σ_{i=0}^{8} a_i ∫_{t_0}^{t_1} e^{-a t} t^i dt    (3)

where the a_i are the coefficients of the degree-eight polynomial ρ(t) I(t), the product of the two quartics.
which can be computed analytically. Nevertheless, equation (3) is subject to numerical instabilities, either for low mean density values or for small intervals. Fortunately, one can replace the attenuation term by a linear operator overcoming this lack of precision:

    e^{-τ(t_0, t)} ≈ 1 + ((e^{-a Δt} - 1) / Δt) (t - t_0)    for a Δt < 0.1

where Δt = t_1 - t_0 and t ∈ [t_0, t_1]. Equation (2) then becomes the integral of a degree-nine polynomial. For higher densities, the error is reduced by evaluating the integral with the recurrence:

    ∫ e^{-a t} t^i dt = -(t^i / a) e^{-a t} + (i / a) ∫ e^{-a t} t^{i-1} dt    (4)

Of course, for highly varying density functions the error becomes too large when the attenuation integral is replaced by its linear version. The solution is to split the interval and perform the calculation on each subinterval. Experience shows that only one subdivision is necessary to reduce the error considerably, whereas Simpson's integration scheme needs at least four subintervals and often more. A simple approach would be to regularly split every interval according to a fixed length, but it is much more interesting to proceed adaptively so as to save computation time. Optimization consists in choosing the current attenuation as a subdivision criterion. However, owing to the high intensities involved, a ray cannot be cut with this parameter alone, because a large luminous flux coming from behind could still illuminate the pixel. Hence a preprocess, running from back to front, is added to memorize for each interval the maximum intensity possibly coming from behind. Finally, one must notice that for a symmetric kernel and an isolated particle, the interval must be split into at least three parts if subdivision is needed. Parallelization of the rendering process can be performed either by distributing each part of the image or
object space to a processor, or by pipelining the tasks, for example assigning one interval to each processor and collecting attenuation and intensity along the ray. Note that equation (4) can easily be pipelined. Experience shows that better results are obtained when using image subdivision, certainly because it is harder to optimize time in the other methods. Our implementation makes use of the PVM parallel communication protocol, running on a CM5 parallel machine but not restricted to it. A dedicated processor sends line numbers and collects the results into a final image. We focused on reliability and efficiency by sending duplicate lines for fast termination of the rendering.

Dazzling, which is a very important phenomenon in an explosion, is due both to intensity overflow at one point and to rapidly increasing illumination. We suppose that the intensity overflow is uniformly redistributed over the whole picture and that there is no influence between wavelengths (intensity surplus in one wavelength is not redistributed into another). These two assumptions allow a fast evaluation of dazzling as a 3-step postprocessing algorithm. In the first step, and for all discretized wavelengths, overflow and remaining intensity are summed and pixels are counted for each energy level. Colors are represented by the classical RGB triplet with 256 intensity levels. The suitable increase for each energy level is calculated in the second step, without overflowing pixels and uniformly distributing the intensity. Finally, in the last step, each pixel is augmented by the appropriate intensity, or clamped if overflowed. The C-like algorithm is:

    for all pixels p[x][y]
        if p[x][y] > 255
            over += p[x][y] - 255
        else
            sum += 255 - p[x][y]
            c[p[x][y]]++

    for i = 255 down to 0
        inc[i] = (over * (255 - i)) / sum
        if (over * (255 - i)) % sum > sum / 2
            inc[i]++
        over -= c[i] * inc[i]
        sum -= c[i] * (255 - i)

    for all pixels p[x][y]
        if p[x][y] > 255
            p[x][y] = 255
        else
            p[x][y] += inc[p[x][y]]
where over is the total amount of overflow intensity, p the pixel value and sum the maximum global intensity increase the picture can stand. Of course, if over ≥ sum, all pixels end up at the maximum energy level. Energy is not exactly conserved, due to discretization and integer division. A more sophisticated algorithm uses an additional array of probabilities (the floating-point part of inc), and random numbers are generated to add or remove one unit from the pixel value. We choose to model dazzling with a function f that modifies the over value after the first step, so as to take into account the non-linearity of the phenomenon and the optical properties of the sensor: f(over) = over' (A + B e^{-C over'}), where over' is the total intensity overflow and the parameters A, B and C are optically measured data. The temporal aspect is included as a function of pixel temporal changes. Finally, the implementation of the first and last steps on a parallel machine is straightforward, and the second step has a low and constant computation time. The additional data consist of a couple of arrays of 256 values, which yields a negligible increase in memory use.
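For completeness, the three-step listing above can be turned into runnable C for a single channel; the guard against a zero headroom sum and the final clamp of augmented pixels are our additions, not part of the original algorithm.

```c
#define LEVELS 256

/* Dazzle postprocessing for one wavelength: overflow above level 255 is
   redistributed over the non-saturated pixels, each level i receiving an
   increment proportional to its headroom (255 - i). pix holds npix
   intensities, possibly greater than 255. */
void redistribute_overflow(int *pix, int npix)
{
    long over = 0, sum = 0;
    long count[LEVELS] = {0};
    int inc[LEVELS] = {0};

    /* step 1: accumulate overflow, headroom and the level histogram */
    for (int p = 0; p < npix; p++) {
        if (pix[p] > 255) over += pix[p] - 255;
        else { sum += 255 - pix[p]; count[pix[p]]++; }
    }
    /* step 2: rounded per-level increment, brightest levels first */
    for (int i = 255; i >= 0 && sum > 0; i--) {
        inc[i] = (int)((over * (255 - i)) / sum);
        if ((over * (255 - i)) % sum > sum / 2) inc[i]++;
        over -= count[i] * inc[i];
        sum  -= count[i] * (255 - i);
    }
    /* step 3: apply increments; clamp overflowed or saturating pixels */
    for (int p = 0; p < npix; p++) {
        if (pix[p] > 255) pix[p] = 255;
        else {
            pix[p] += inc[pix[p]];
            if (pix[p] > 255) pix[p] = 255;
        }
    }
}
```

On a two-pixel example {300, 100}, the 45 units of overflow are transferred entirely to the second pixel, so total energy is conserved up to the integer rounding discussed above.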
CONCLUSION AND RESULTS
A model based on smoothed particles and a fast rendering approximation for highly emissive density volumes has been proposed. Dynamics is driven by modified particle system rules, combining smoothed particle dynamics, chemical evolution rules and measured data. The polynomial kernel and the linear approximation of the attenuation integral allow an analytical solution for the illumination integral. For example, picture 2 used 270 particles and was generated in 15 seconds (including simulation, rendering and dazzle postprocessing) by the CM5 (32 SUN Sparc2 processors) at a resolution of 480x300. Simulation needs a lot of computation time since, as in any N-body problem, the rules used involve a summation over all particles. Fortunately, the kernel falls to zero outside the surrounding sphere, which allows local solution calculation; a number of optimizations have been proposed, in particular hierarchical voxel grids, to reduce the algorithm complexity. A dazzle postprocessing optimized according to some assumptions has been proposed. Further work will include a non-uniform diffusion algorithm and more complex functions depending not only on the total overflow energy, but also on statistical properties. Another optimization, not yet implemented, consists in redistributing the overflow intensity "on the fly", during the rendering calculation. Its advantage is a gain in computing time when the dazzle intensity information is used for a coarser evaluation of the integral at other pixels. This method would be more efficient if rendering were not done sequentially line by line, but started with the most radiative pixels first (an approximation can be determined by projection methods). However, it should not be used for weakly dazzled pictures.
Figure 2: Explosion made of 270 blobs

The authors would like to thank Bernard Perroche (École des Mines de Saint-Étienne, France) for fruitful advice and the time spent.
REFERENCES

Agui T., Kohno Y. and Nakajima M. 1991. "Generating 2D Flame Images in Computer Graphics", Trans Inst Electronics Inf Commun Eng, J74-D-2: 184-189.
Arques D., Michelin S. and Azarian S. 1996. "Extension de la radiosité classique aux milieux semi-transparents : développements mathématiques pour la méthode zonale", Revue de CFAO et d'informatique graphique, vol. 11, no. 1-2: 91-109.
Bhate N. and Tokuta A. 1992. "Photorealistic Volume Rendering of Media with Directional Scattering", Proceedings of the 3rd Eurographics Workshop on Rendering, (May): 227-245.
Blanc-Talon J. 1997. "Performance Characteristics of Computer Vision Algorithms", ETCA, 97-R-05, Arcueil, France.
Blasi P., Le Saec B. and Schlick C. 1994. "An Importance Driven Monte-Carlo Solution to the Global Illumination Problem", 5th Eurographics Workshop on Rendering, (Jun): 173-183.
Blinn J.F. 1982a. "A Generalization of Algebraic Surface Drawing", ACM Transactions on Graphics, (Jul), vol. 1, no. 3: 235-256.
Blinn J.F. 1982b. "Light Reflection Functions for Simulation of Clouds and Dusty Surfaces", ACM Computer Graphics, (Jul), vol. 16, no. 3: 21-29.
Gamito M.N., Lopes P.F. and Gomes M.R. 1995. "Two-dimensional simulation of gaseous phenomena using vortex particles", 6th Eurographics Workshop on Animation and Simulation, (Sep): 3-15.
Gardner G.Y. 1985. "Visual Simulation of Clouds", ACM Computer Graphics, (Jul), vol. 19, no. 3: 297-303.
Haas S. and Sakas G. 1992. "Methods for Efficient Sampling of Arbitrarily Distributed Volume Densities", Photorealism in Computer Graphics (Proceedings Eurographics Workshop on Photosimulation, Realism and Physics in Computer Graphics, 1990): 215-227.
Inakage M. 1989. "A simple model of flames", Proceedings of Computer Graphics International: 71-81.
Kajiya J.T. and Von Herzen B.P. 1984. "Ray Tracing Volume Densities", ACM Computer Graphics, (Jul), vol. 18, no. 3: 165-174.
Languenou E., Bouatouch K. and Chelle M. 1994. "Global Illumination in Presence of Participating Media with General Properties", 5th Eurographics Workshop on Rendering, (Jun): 69-85.
Luciani A., Habibi A. and Duroc Y. 1995. "A Physical Model Of Turbulent Fluids", 6th Eurographics Workshop on Animation and Simulation, (Sep): 16-29.
Max N. 1986. "Atmospheric Illumination and Shadows", ACM Computer Graphics, (Aug), vol. 20, no. 4: 117-124.
Max N. 1994a. "Computer Animation of Clouds", Proceedings of Computer Animation '94, (May): 167-174.
Max N. 1994b. "Efficient Light Propagation for Multiple Anisotropic Volume Scattering", 5th Eurographics Workshop on Rendering, (Jun): 87-104.
Monaghan J.J. 1982. "Why Particle Methods Work", SIAM J. Sci. Stat. Comput., vol. 3, no. 4: 422-433.
Murakami S. and Ichihara H. 1987. "On a 3D Display Method by Metaball Technique", Trans Inst Electronics Inf Commun Eng, vol. J70-D, no. 8: 1607-1615.
Nishita T., Miyawaki Y. and Nakamae E. 1987. "A Shading Model for Atmospheric Scattering considering Luminous Intensity Distribution of Light Sources", ACM Computer Graphics, (Jul): 303-308.
Patmore C. 1993. "Simulated Multiple Scattering for Cloud Rendering", Graphics, Design and Visualization (IFIP Transactions B-9): 59-70.
Perlin K. and Hoffert E.M. 1989. "Hypertexture", ACM Computer Graphics, (Jul), vol. 23, no. 3: 253-262.
Perry C.H. and Picard R.W. 1994. "Synthesizing Flames and their Spreading", Proceedings of the 5th Eurographics Animation and Simulation Workshop.
Reeves W.T. 1983. "Particle systems - a technique for modelling a class of fuzzy objects", ACM Computer Graphics, (Jul), vol. 17, no. 3: 359-376.
Rushmeier H.E. 1988. "Realistic Image Synthesis for Scenes with Radiatively Participating Media", Ph.D. Thesis, Cornell University.
Rushmeier H.E. and Torrance K.E. 1987. "The Zonal Method for Calculating Light Intensities in the Presence of a Participating Medium", ACM Computer Graphics, (Jul), vol. 21, no. 4: 293-302.
Sakas G. 1993. "Modeling and Animating Turbulent Gaseous Phenomena using Spectral Synthesis", The Visual Computer, vol. 9: 200-212.
Sakas G. and Kernke B. 1994. "Texture Shaping: a method for modelling arbitrarily shaped volume objects in texture space", Photorealistic Rendering in Computer Graphics, (May): 206-218.
Sillion F. 1994. "Clustering and Volume Scattering for Hierarchical Radiosity Calculations", Proceedings of the 5th Eurographics Workshop on Rendering, (Jun).
Sims K. 1990. "Particle animation and rendering using data-parallel computation", ACM Computer Graphics, (Aug), vol. 24, no. 4: 405-413.
Stam J. and Fiume E. 1993. "Turbulent Wind Fields for Gaseous Phenomena", ACM Computer Graphics, (Aug): 369-376.
Stam J. and Fiume E. 1994. "Stochastic Rendering of Density Fields", Proceedings of Graphics Interface '94, (May): 51-58.
Stam J. and Fiume E. 1995. "Depicting Fire and Other Gaseous Phenomena Using Diffusion Processes", Computer Graphics Proceedings, Annual Conference Series, 1995: 129-136.
Takai Y., Ecchu K. and Takai N.K. 1995. "A Cellular Automaton Model of Particle Motions and its Applications", The Visual Computer, vol. 11: 240-252.
Voss R. 1983. "Fourier synthesis of gaussian fractals: 1/f noises, landscapes and flakes", ACM Computer Graphics (SIGGRAPH '83 Tutorial on State of the Art Image Synthesis), vol. 10.
Wejchert J. and Haumann D. 1991. "Animation aerodynamics", ACM Computer Graphics, (Jul), vol. 25, no. 4: 19-22.

Fabrice Uhl was born in Benfeld, France, in 1970. He received a master's degree from Strasbourg University and a Diplôme d'Études Approfondies in 1994 from the same institution. He is currently a PhD student at the ETCA. His fields of interest include Image Synthesis and Fractal Techniques.
Jacques Blanc-Talon was born in L'Haÿ-les-Roses, France, in 1962. He graduated as an Engineer from the EFREI, Paris, in 1984, received the Diplôme d'Études Approfondies from the University Paris XI, Orsay, in 1985, and the Doctorat d'Université (PhD) from the same University in 1991. He held a visiting position at the Division of Information Technology (DIT) of the CSIRO in Canberra, Australia (ACT) from 1991 to 1993. He is currently a Senior Scientific Consultant at the ETCA. His research interests include Formal Languages, Fractals, Image Processing, Data Fusion and Image Synthesis.