however, that multiterminal information theory research papers will refer to Csiszár and Körner with steadily increasing frequency for many years to come. Any serious Shannon theorist who does not become at least moderately familiar with this important book is risking accelerated obsolescence.
Secure Communications and Asymmetric Cryptosystems, G. J. Simmons, Ed. (Boulder, CO: Westview Press, 1982, 338 pp. (including index), ISBN 0-86531-338-5, $30.00).
N. J. A. SLOANE, FELLOW, IEEE
This book appears as part of the Selected Symposia Series of the American Association for the Advancement of Science. According to the introduction, this series is intended "to provide a means for more permanently recording and more widely disseminating material which is discussed at the AAAS Annual National Meetings.... The format is designed to provide for rapid dissemination of information, so the papers are not typeset but are reproduced directly from the camera-copy submitted by the authors." We are then told that this book is based on a symposium held at the 1980 AAAS Annual National Meeting in San Francisco. So far, so good. When the reader opens the book, he finds that indeed the papers are not typeset; they have just been typed. But upon closer inspection, one discovers that the papers look rather familiar. In fact, at least six of the ten papers in this book are well-known papers that have already been "widely disseminated" in an even more permanent form, namely:
1) R. C. Merkle, "Protocols for public key cryptosystems," Commun. Ass. Comput. Mach., 1981.
2) W. Diffie and M. E. Hellman, "New directions in cryptography," IEEE Trans. Inform. Theory, vol. IT-22, no. 6, pp. 644-654, Nov. 1976.
3) R. C. Merkle, "Secure communications over insecure channels," Commun. Ass. Comput. Mach., 1978.
4) R. C. Merkle and M. E. Hellman, "Hiding information and signatures in trapdoor knapsacks," IEEE Trans. Inform. Theory, vol. IT-24, no. 5, pp. 525-530, Sept. 1978.
5) R. L. Rivest, A. Shamir, and L. M. Adleman, "A method for obtaining digital signatures and public key cryptosystems," Commun. Ass. Comput. Mach., 1978.
6) G. J. Simmons, "Symmetric and asymmetric encryption," Comput. Surveys, 1979.
One wonders why it was necessary to take papers that have already been published (and typeset) and publish them in typescript. The remaining four papers in this volume are the following:
7) H. C. Williams, "Computationally 'hard' problems as a source for cryptosystems."
8) W. Diffie, "Conventional versus public key cryptosystems" (also presented at ICC '79).
9) G. J. Simmons, "Message authentication without security."
10) W. Diffie, "Cryptographic technology: Fifteen-year forecast."
Incidentally, the papers by Diffie and Hellman; Merkle and Hellman; Rivest, Shamir, and Adleman; and the 1978 Merkle paper have also been reprinted (verbatim, not in a retyped version) in Donald W. Davies' superb collection of papers The Security of Data in Networks, IEEE Computer Society Press, 1981, #U366, Member: $15.00, Nonmember: $20.00. (All purchases can be made through the IEEE Computer Society Press, 10662 Los Vaqueros Circle, Los Alamitos, CA 90720.) Davies' collection contains 22 of the best recent papers in cryptography and is a much better value than the book under review.
Digital Picture Processing, 2nd ed., A. Rosenfeld and A. C. Kak (New York: Academic, 1982, Vol. 1: xiii + 435 pp., Vol. 2: vii + 349 pp.).
ERIC DUBOIS, MEMBER, IEEE, AND AMAR MITICHE,† MEMBER, IEEE
†Volume 1 has been reviewed by E. Dubois and Volume 2 by A. Mitiche.
This new edition of Digital Picture Processing is a revised and expanded version of the popular first edition, published in 1976. The book is now in two volumes, the contents of which are only loosely dependent. The first volume presents introductory fundamentals and image processing techniques such as coding, restoration, and reconstruction. The second volume is dedicated to image analysis techniques. The introduction is included in both volumes, to make each self-contained.

VOLUME 1
Contents: Chapter 1: Introduction. Chapter 2: Mathematical Preliminaries. Chapter 3: Visual Perception. Chapter 4: Digitization. Chapter 5: Compression. Chapter 6: Enhancement. Chapter 7: Restoration. Chapter 8: Reconstruction.

Chapters 1 through 7 are revised and expanded versions of the corresponding chapters in the first edition, while Chapter 8, on reconstruction, is new. The volume is essentially limited to the processing of still imagery, as opposed to time-varying television-type imagery.

Chapter 2 covers the basics of linear processing of pictures and random field models. Most of the development is for images that are continuous functions of space. The processing of discrete-space functions (i.e., sampled images) is limited to block processing using a vector space approach, while the well-known theory of digital signal processing and two-dimensional digital filtering is not mentioned. The section on random fields is particularly welcome, although again a presentation reflecting the discrete-space nature of digital pictures would be more appropriate. Some of the material on Markov models for images included in later chapters could also have been presented here.

Chapter 3, presenting a number of properties of the human visual system applicable to image processing, is essentially unchanged from the first edition. This chapter is very descriptive, giving no quantitative representation of light (i.e., photometry) or of specific properties of the human visual system that can serve to develop a mathematical model of visual perception. Such models will be essential in the optimization of algorithms for the processing of images destined for human viewing.

Chapter 4, on digitization, discusses the issues of sampling and quantization of pictures. This chapter is also essentially unchanged from the first edition. The topic of nonorthogonal sampling lattices is included. This topic has received considerable attention recently, although it is of more significance for moving images. The important topic of subjectively optimal quantization of pictures, as described for example by Kretz (F. Kretz, "Subjectively optimal quantization of pictures," IEEE Trans. Commun., vol. COM-23, pp. 1288-1292, Nov. 1975), is a notable omission from this chapter.

Chapter 5 discusses compression, otherwise known as source coding. New material has been included on transform coding, especially relating to the Karhunen-Loève transform and the discrete cosine transform, and the concept of bit allocation is now treated. New sections on block truncation coding (a kind of adaptive quantization) and error-free compression have also been added. Absent from this chapter is any discussion of coding with respect to a visual fidelity criterion.

Chapters 6 and 7 deal with image enhancement and restoration, respectively. Image enhancement is largely a process of image restoration when there is little specific knowledge about the degradation process. Chapter 6 contains a welcome addition on the analysis of illumination effects, although this material on image formation should probably be placed much earlier in the book. This chapter contains many photos illustrating the techniques, although they are often too small for the reader to properly evaluate the results. Many of the sections in Chapter 7 have been substantially enlarged, and much new material has been included, including the maximum a posteriori method and the maximum entropy method.
However, there is little discussion of the relative advantages and disadvantages of the various methods, especially in terms of performance versus complexity.

Chapter 8 is completely new, presenting the theory of image reconstruction from projections. A detailed description of the techniques of reconstruction from parallel projections and fan projections, as well as of algebraic reconstruction techniques, is given. Computational considerations and the effects of noise and aliasing are treated.
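For readers who have not met the algebraic approach, the sketch below is a minimal illustration of the Kaczmarz-style update that underlies basic algebraic reconstruction techniques; it is not drawn from the book, and the dense system matrix, toy ray geometry, and iteration counts are illustrative assumptions. The unknown image is flattened into a vector, each measured ray sum contributes one linear equation, and the current estimate is repeatedly projected onto the hyperplane defined by each equation.

```python
import numpy as np

def art_reconstruct(A, p, n_iters=50, relax=1.0):
    """Kaczmarz-style algebraic reconstruction: drive A @ x toward p row by row."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)          # squared norm of each ray's row
    for _ in range(n_iters):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            # Project the estimate onto the hyperplane  A[i] . x = p[i].
            residual = p[i] - A[i] @ x
            x = x + relax * (residual / row_norms[i]) * A[i]
    return x

# Toy example: a 2x2 "image" probed by its two row sums and two column sums.
image = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
A = np.array([[1, 1, 0, 0],   # horizontal rays
              [0, 0, 1, 1],
              [1, 0, 1, 0],   # vertical rays
              [0, 1, 0, 1]], dtype=float)
p = A @ image.ravel()         # simulated noiseless projection data
print(art_reconstruct(A, p).reshape(2, 2))   # close to the original 2x2 image
```

In practice the system matrix is large and sparse, projections are taken at many angles, and the relaxation parameter and stopping rule are chosen to limit noise amplification.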
VOLUME 2

Contents: Chapter 1: Introduction. Chapter 9: Matching. Chapter 10: Segmentation. Chapter 11: Representation. Chapter 12: Description.

The second volume includes a new chapter on matching and expanded chapters on segmentation, representation, and description. But the relatively new topic of dynamic scene analysis has not been treated (although it is briefly mentioned in an appendix in Chapter 9). The cause for this omission is certainly the fact that it is very difficult to keep pace with the rapidly growing field of image processing.

Chapter 9 is on matching and covers imaging geometry, registration, geometric transformations, and match measurements. Although this chapter contains much new material, it does not meet expectations. Image formation has not been discussed at length, and adequate referencing for further reading has not been provided. The very convenient homogeneous coordinate representation has not been mentioned in the discussion of coordinate transformations. Moreover, imaging transformations have been described using the pinhole projection model with the center of projection in front of the camera. Although equivalent, the central projection model, modified so as not to invert images, would have been more appreciated, since it is a much more popular model among the image processing community.
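By way of comparison (and not taken from the book), the two conventions differ only in the placement of the image plane relative to the center of projection, which in homogeneous coordinates amounts to a sign in the projection matrix. The sketch below is a minimal illustration; the focal length, sample points, and the function itself are assumptions made for this example.

```python
import numpy as np

def project(points, f=1.0, frontal=True):
    """Perspective projection of camera-frame 3-D points (z > 0).

    frontal=True : central projection, image plane at z = +f (no inversion).
    frontal=False: classical pinhole, image plane at z = -f (inverted image).
    """
    sign = 1.0 if frontal else -1.0
    # 3x4 projection matrix acting on homogeneous points (x, y, z, 1).
    P = np.array([[sign * f, 0.0,      0.0, 0.0],
                  [0.0,      sign * f, 0.0, 0.0],
                  [0.0,      0.0,      1.0, 0.0]])
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    uvw = homog @ P.T
    return uvw[:, :2] / uvw[:, 2:]            # divide out the third coordinate

pts = np.array([[0.5, 0.2, 2.0],
                [-1.0, 0.4, 4.0]])
print(project(pts, frontal=True))    # first point maps to (0.25, 0.10)
print(project(pts, frontal=False))   # same magnitudes with the signs flipped
```

Writing the projection as a matrix acting on homogeneous coordinates also makes it easy to compose with rotations and translations, which is one reason the homogeneous representation is generally considered convenient.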
Segmentation is treated in Chapter 10, which is by far the largest chapter. This is normal, considering the scope of the subject. Most aspects of image segmentation are thoroughly discussed, and many good examples are given. What seems to be missing is a taxonomy that would provide an overall understanding of the segmentation process independently of particular segmentation schemes; a useful addition would be a general discussion on how assumptions about images can influence the choice of segmentation techniques.

Chapter 11 is on representation and describes in detail various representation schemes, conversion between representations, and geometric property measurements. Curve and border representation are discussed at length (should shape analysis be treated as a subject apart?).

Chapter 12 is on picture description, which deals with the specification of properties of parts of pictures and relationships between the parts. This chapter provides a clear discussion of various image properties such as moments, properties of projections and transforms, and statistical and textural properties. It also deals with image models, with emphasis being put on syntactic modeling.

Either volume in this second edition can serve as a textbook for a specialty course in image processing. A limited number of exercises are included throughout the text; this is not a serious drawback, since many image processing courses tend to be project oriented. Overall, the book is characterized by its clarity and style of exposition, as well as by its careful choice and balanced treatment of topics. It should prove to be an excellent reference manual in theoretical and practical image processing.

Eric Dubois is an Associate Professor at INRS-Telecommunications, a research institute affiliated with the University of Quebec. He received the B.Eng. and M.Eng. degrees from McGill University, Montreal, in 1972 and 1974, respectively, and the Ph.D. degree from the University of Toronto in 1978, all in electrical engineering. His interests are in the areas of digital signal processing and image processing and coding. He has collaborated with the Visual Communications Systems group of Bell-Northern Research.

Amar Mitiche is a Research Associate at INRS-Telecommunications in Montreal, Quebec. He received the Licence ès Sciences degree in mathematics from the University of Algiers, North Africa, and the Ph.D. degree in computer science from the University of Texas at Austin. He was a Research Scientist with the Laboratory for Image and Signal Analysis of the University of Texas at Austin. His current research interest is in computer vision.

BOOKS RECEIVED
MICHAEL A. KAPLAN

Fourth Symposium on Information Theory in the Benelux, E. C. van der Meulen, Ed. (Leuven: Acco, 1983, 198 pp.).
Information theorists in Belgium, The Netherlands, and Luxemburg meet in May of each year. The present volume records the twenty-four papers given at the 1983 meeting. Two of the twenty-four, "Primality and Factorization" (H. W. Lenstra, Jr.) and "Logarithms in Finite Cyclic Groups: Cryptographic Issues" (J. L. Massey), were invited. The twenty-two contributed papers, many in summary form, are presented in six sessions: Coding Theory, 5 papers; Cryptography, 2 papers; Detection, Estimation, and Sampling, 4 papers; Image Analysis, Image Modelling, and Image Processing, 3 papers; Information Measures, 2 papers; Multi-User Information Theory, 6 papers. A total of seven of the twenty-two presentations (including one from the coding session) deal with the multi-user problem and related channel models (binary-adder, binary-multiplier, binary erasure), giving multi-user work a very strong presence in the Proceedings. Other topics represented include the following: Digital Convexity and Straightness on the Hexagonal Grid, Convolutional Decoder with Reliability Information, the Weight Enumerator for Self-Dual Codes, Determination of the Global Extremum of a Function of Several Variables, Bounds on the Sampling Rate for Short-Time Narrow-Band Signals, Implicit Sampling Model for Images, and Properties of Spectral Distortion Measures.

Probability Theory and Computer Science, G. Louchard and G. Latouche, Eds. (New York: Academic Press, 1983, xiii + 209 pp.).
An overview in three parts of stochastic performance modeling. Part I, "Stochastic Modeling: Ideas and Techniques," is by D. P. Gaver; it summarizes the popular models (Markov, renewal, diffusion approximation, special distributions), pointing to recent literature for additional detail and useful variants. Part II, "Stochastic Modeling: Queueing Models," is an applications-oriented review of basic queueing theory by H. Kobayashi, whose survey articles on this topic are by now quite numerous. Part III, by R. Sedgewick, is almost orthogonal to the first two. Entitled "Mathematical analysis of combinatorial algorithms," it might reasonably be described as a primer to Volume III of D. E. Knuth's The Art of Computer Programming. The volume is not a textbook. On the whole, however, it is a very pleasant and well-motivated introduction to the topics covered, and useful to the beginner as a pointer to the literature.

The Visual Display of Quantitative Information, Edward R. Tufte (Cheshire, CT: Graphics Press, 1983, 197 pp.).
The topic here is data graphics: secondarily as art, primarily as a medium for information transfer. The first part ("Graphical Practice") is an annotated collection of examples, drawn from the sixteenth century to the present, in which can be seen both the history and the state of the art. The second part ("Theory of Data Graphics"), developing the lessons learned by case study in the first, is a guide to good technique and to some of the mistakes to avoid ("chartjunk"). The text is thoroughly supported with illustrations. This is a charming, well-wrought production on a topic that, though presently orthogonal to the main interests of the IT readership, is bound to assume a wider importance as more and more of us, through poster sessions at conferences, computer typography, and advanced telecommunication network services, become involved in visual communications.