Squeeze Methods for Generating Gamma Variates
BRUCE W. SCHMEISER and RAM LAL
Journal of the American Statistical Association, Vol. 75, No. 371 (Sep., 1980), pp. 679-682
Two algorithms are given for generating gamma distributed random variables. The algorithms, which are valid when the shape parameter is greater than one, use a uniform majorizing function for the body of the distribution and exponential majorizing functions for the tails. The algorithms are self-contained, requiring only U(0, 1) variates. Comparisons are made to four competitive algorithms in terms of marginal execution times, initialization time, and memory requirements. Marginal execution times are less than those of existing methods for all values of the shape parameter, as implemented here in FORTRAN.

KEY WORDS: Gamma distribution; Simulation; Random number generation; Rejection methods; Monte Carlo; Distribution sampling.

1. INTRODUCTION

During the last few years many algorithms have been developed for generation of gamma random variables having density function

    f(x) = x^(α−1) exp(−x) / Γ(α),   0 ≤ x .

Let t(x) be a majorizing function and b(x) a minorizing (squeeze) function (t(x) ≥ f(x) for all x and b(x) ≤ f(x) for all x). Then

1. Generate x having density r(x) = t(x) / ∫ t(y) dy.
2. Generate v ~ U(0, 1).
3. If v ≤ b(x)/t(x), deliver x.
4. If v ≤ f(x)/t(x), deliver x. Otherwise, go to step 1.

2. THE ALGORITHMS

Similar to the beta algorithms of Schmeiser and Shalaby (1980), the points of inflection and the mode are central to this algorithm. Define

    x1 = x2(1 − 1/(x3 − x2))
    x2 = max(0, x3 − √x3)
    x3 = α − 1
    x4 = x3 + √x3
    x5 = x4(1 + 1/(x4 − x3)) .

Here x3 is the mode, x2 and x4 are the points of inflection of f(x), and x1 and x5 are the points at which the tangents of f(x) at x2 and x4 cross the x axis. If α < 2, there is no left point of inflection, and x1 = x2 = 0. The density is scaled so that f(x3) = 1. The majorizing function is

    t(x) = f(x2) exp(λL(x − x2)) ,   0 ≤ x < x2
         = 1 ,                       x2 ≤ x ≤ x4
         = f(x4) exp(−λR(x − x4)) ,  x4 < x ,

where λL = 1/(x2 − x1) and λR = 1/(x5 − x4).
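As a concrete restatement of the definitions above, the five breakpoints follow directly from the shape parameter α. The paper's implementations are in FORTRAN; the sketch below is only a modern Python transcription of the formulas, not the authors' code.

```python
import math

def breakpoints(alpha):
    """Compute x1..x5 for the mode-scaled gamma density, alpha > 1.

    x3 is the mode, x2 and x4 the inflection points, and x1 and x5 the
    points where the tangents of f at x2 and x4 cross the x axis.
    """
    x3 = alpha - 1.0                      # mode
    s = math.sqrt(x3)
    x2 = max(0.0, x3 - s)                 # left inflection (0 when alpha < 2)
    x4 = x3 + s                           # right inflection
    x1 = x2 * (1.0 - 1.0 / (x3 - x2))     # tangent at x2 meets axis (0 if x2 = 0)
    x5 = x4 * (1.0 + 1.0 / (x4 - x3))     # tangent at x4 meets axis
    return x1, x2, x3, x4, x5
```

For α = 5 the breakpoints come out to the round values (1, 2, 4, 6, 9), which makes the geometry easy to check by hand.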
7. If u > p5, go to step 8. Otherwise, set x = x2 + (x3 − x2)w. If (u − p4)/(p5 − p4) ≥ w, deliver x. Otherwise, set v = f2 + (u − p4)/(x3 − x2) and go to step 13.
8. If u > p6, go to step 9. Otherwise, set x = x3 + (x4 − x3)w. If (p6 − u)/(p6 − p5) ≥ w, deliver x. Otherwise, set v = f4 + (u − p5)/(x4 − x3) and go to step 13.
9. If u > p8, go to step 11. Otherwise, sample w2 ~ U(0, 1). If w2 > w, set w = w2. If u > p7, go to step 10. Otherwise, set x = x1 + (x2 − x1)w and v = f1 + 2w(u − p6)/(x2 − x1). If v ≤ f2·w, deliver x. Otherwise, go to step 13.
10. Set x = x5 − w(x5 − x4) and v = f5 + 2w(u − p7)/(x5 − x4). If v ≤ f4·w, deliver x. Otherwise, go to step 13.
11. If u > p9, go to step 12. Otherwise, set u = (p9 − u)/(p9 − p8) and x = x1 + (ln u)/λL. If x < 0, go to step 3. If w ≤ (λL(x − x1) + 1)/u, deliver x. Otherwise, set v = w·f1·u and go to step 13.
12. Set u = (p10 − u)/(p10 − p9) and x = x5 − (ln u)/λR. If w ≤ (λR(x5 − x) + 1)/u, deliver x. Otherwise, set v = w·f5·u.
13. If ln v ≤ x3 ln(x/x3) + x3 − x, deliver x. Otherwise, go to step 3.

The expected number of U(0, 1) variates required per gamma variate generated is

    2[f(x3)(x4 − x2) + x2 f(x2)/(x3 − x2) + x4 f(x4)/(x4 − x3)]

for G2PE and is

    [f(x1)(x2 − x1) + f(x2)(x3 − x2) + f(x4)(x4 − x3) + f(x5)(x5 − x4)]
    + 2[(f(x3) − f(x2))(x3 − x2) + (f(x3) − f(x4))(x4 − x3)
        + x1 f(x1)/(x2 − x1) + x5 f(x5)/(x5 − x4)]
    + 3[(f(x2) − f(x1))(x2 − x1) + (f(x4) − f(x5))(x5 − x4)]/2

for G4PE.

Algorithms G2PE and G4PE, as implemented in FORTRAN for this study, may be obtained from the authors.
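The overall structure of these algorithms, majorization, exponential tail sampling, and the logarithmic acceptance test of step 13, can be shown compactly. The Python sketch below is deliberately not G2PE or G4PE: it uses a single constant majorizer over [0, x4], one tangent exponential for the right tail, and no squeeze tests, trading the paper's speed for brevity. All names are illustrative.

```python
import math
import random

def gamma_simple(alpha, rng=random):
    """Illustrative rejection sampler for Gamma(alpha), alpha > 1.

    Works with the mode-scaled density ln f(x) = x3 ln(x/x3) + x3 - x,
    so f(x3) = 1.  Majorizer: t(x) = 1 on [0, x4] and the tangent
    exponential f(x4) exp(-lam (x - x4)) beyond x4.  Since ln f is
    concave, the tangent exponential dominates f on the tail.
    """
    x3 = alpha - 1.0                         # mode
    x4 = x3 + math.sqrt(x3)                  # right inflection point
    lam = 1.0 - x3 / x4                      # -d(ln f)/dx at x4, positive
    lf4 = x3 * math.log(x4 / x3) + x3 - x4   # ln f(x4)
    area_body = x4                           # area under t on [0, x4]
    area_tail = math.exp(lf4) / lam          # area under the tail exponential
    p_body = area_body / (area_body + area_tail)
    while True:
        if rng.random() < p_body:
            x = x4 * (1.0 - rng.random())    # uniform over (0, x4]
            ln_t = 0.0                       # ln t(x) = ln 1
        else:
            u = 1.0 - rng.random()           # in (0, 1]
            x = x4 - math.log(u) / lam       # exponential tail beyond x4
            ln_t = lf4 - lam * (x - x4)
        ln_f = x3 * math.log(x / x3) + x3 - x
        if math.log(1.0 - rng.random()) + ln_t <= ln_f:  # step-13 style test
            return x
```

The paper's algorithms refine exactly this loop: more majorizing pieces shrink the rejection region, and linear squeeze tests avoid the logarithms on most iterations.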
3. COMPUTATIONAL RESULTS

Based on the findings of Cheng (1977), Kinderman and Monahan (1978), Marsaglia (1977), and Tadikamalla (1978b), it appears that the four fastest existing FORTRAN-level algorithms are to be found in these papers. Using the names given in these papers, we denote the algorithms GB, GRUB, RGAMA, and GAMMA, respectively.
The table compares GB, GRUB, RGAMA, GAMMA, G2PE, and G4PE in terms of generation time per variate, initialization time, and memory requirements. The times are based on the generation of 10,000 variates and are accurate to within .02 milliseconds. The algorithms were coded in FORTRAN on the CDC CYBER 72 computer at Southern Methodist University.
The uniform variates were generated by the relatively fast RANF, which is intrinsic to the FTN compiler, and by GGUBF in the International Mathematical and Statistical Libraries, Inc. (IMSL) (1979), which is about five times slower than RANF. While the marginal execution times are larger using GGUBF, the relative rankings do not change greatly.
Each of the algorithms requires a one-time initialization. In order of increasing set-up time the algorithms are GB, RGAMA, GAMMA, G2PE, GRUB, and G4PE. Since the algorithms with lower marginal times tend to have higher set-up times, a trade-off is made that depends on M, the required number of variates for a fixed value of α. For α < 2, RGAMA is fastest for M ≤ 3 and G4PE is fastest for M ≥ 4, and for α > 2, RGAMA is fastest for M ≤ 5 and G4PE is fastest for M ≥ 6, when using RANF. The result changes slightly using GGUBF, where for α < 2, RGAMA is fastest for M ≤ 2 and G4PE is fastest for M ≥ 3, and for α > 2, RGAMA and GB are fastest for M ≤ 4 and G4PE is fastest for M ≥ 5. Of course the specific break-even points will differ with various implementations, but these results give an indication of the effect of different U(0, 1) generators. Specifically, slower U(0, 1) generators tend to favor algorithms having higher set-up times, assuming the expected number of U(0, 1) variates is about the same.
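The set-up/marginal trade-off behind these break-even points can be made concrete with a small cost model: total cost for M variates is set-up plus M times the marginal cost. The timings in the comment below are hypothetical placeholders, not measurements from the table.

```python
def breakeven(setup_a, marginal_a, setup_b, marginal_b):
    """Smallest M at which algorithm B's total cost undercuts A's.

    Total cost for M variates is setup + M * marginal.  Returns None if
    B is never cheaper (its marginal cost is not lower than A's).
    """
    if marginal_b >= marginal_a:
        return None
    # Solve setup_b + M*marginal_b < setup_a + M*marginal_a for integer M.
    m = (setup_b - setup_a) / (marginal_a - marginal_b)
    return max(1, int(m) + 1)

# Hypothetical illustrative timings (milliseconds): a low-set-up,
# high-marginal algorithm vs. a high-set-up, low-marginal one.
# breakeven(0.1, 0.30, 1.0, 0.15) gives the M beyond which the
# second algorithm wins.
```

This mirrors the observation above: a slower U(0, 1) generator inflates marginal costs on both sides, which pushes the break-even point toward the algorithm with the higher set-up cost but fewer expected uniforms.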
Ease of implementation may be crudely measured in terms of lines of code and additional algorithms needed. In ascending order of lines of code the algorithms are
GB, RGAMA, GAMMA, GRUB, G2PE, and G4PE, although the first four require very close to the same
number of lines. The only additional algorithm needed is the normal generator used by RGAMA.

[Received August 1978. Revised June 1979.]

REFERENCES

Ahrens, J.H., and Dieter, U. (1974), "Computer Methods for Sampling From Gamma, Beta, Poisson and Binomial Distributions," Computing, 12, 223-246.
Atkinson, A.C. (1977), "An Easily Programmed Algorithm for Generating Gamma Random Variables," Journal of the Royal Statistical Society, Ser. A, 140, 232-234.
Atkinson, A.C., and Pearce, M.C. (1976), "The Computer Generation of Beta, Gamma and Normal Random Variables," Journal of the Royal Statistical Society, Ser. A, 139, 431-461.
Cheng, R.C.H. (1977), "The Generation of Gamma Variables With Non-integer Shape Parameter," Applied Statistics, 26, 71-75.
Fishman, George S. (1973), Concepts and Methods in Discrete Event Digital Simulation, New York: John Wiley & Sons.
Fishman, George S. (1976), "Sampling From the Gamma Distribution on a Computer," Communications of the ACM, 19, 407-409.
Greenwood, A.J. (1974), "A Fast Generator for Gamma-distributed Random Variables," in COMPSTAT: Proceedings in Computational Statistics, eds. G. Bruckman, F. Ferschl, L. Schmetterer, Vienna: Physica-Verlag, 19-27.
International Mathematical and Statistical Libraries, Inc. (1979), Sixth Floor-GNB Building, 7500 Bellaire, Houston, Tex.
Jöhnk, M.D. (1964), "Erzeugung von betaverteilten und gammaverteilten Zufallszahlen," Metrika, 8, 5-15.
Kinderman, Albert J., and Monahan, John F. (1978), "Recent Developments in the Computer Generation of Student's t and Gamma Random Variables," Technical Report AMD-799, BNL-24679, Brookhaven National Laboratory (preliminary version).
Kinderman, A.J., and Ramage, J.G. (1976), "Computer Generation of Normal Random Variables," Journal of the American Statistical Association, 71, 893-896.
Marsaglia, George (1977), "The Squeeze Method for Generating Gamma Variates," Computers and Mathematics With Applications, 3, 321-325.
McGrath, E.J., and Irving, D.C. (1973), Techniques for Efficient Monte Carlo Simulation. Vol. II. Random Number Generation for Selected Probability Distributions, Springfield, Va: National Technical Information Service.
Schmeiser, Bruce W. (1980), "Generation of Variates From Distribution Tails," Operations Research, 28, forthcoming.
Schmeiser, Bruce W., and Shalaby, Mohamed A. (1980), "Acceptance/Rejection Methods for Beta Variate Generation," Journal of the American Statistical Association, 75, 673-678.
Tadikamalla, Pandu R. (1978a), "Computer Generation of Gamma Random Variables," Communications of the ACM, 21, 419-422.
Tadikamalla, Pandu R. (1978b), "Computer Generation of Gamma Random Variables-II," Communications of the ACM, 21, 925-928.
Wallace, C.S. (1976), "Transformed Rejection Generators for Gamma and Normal Pseudo-Random Variables," The Australian Computer Journal, 8, 103-105.
Wallace, N.D. (1974), "Computer Generation of Gamma Random Variates With Non-integral Shape Parameters," Communications of the ACM, 17, 691-695.
Whittaker, J. (1974), "Generating Gamma and Beta Random Variables With Non-integral Shape Parameters," Applied Statistics, 23, 210-214.