Max-Planck-Institut für Wissenschaftsgeschichte

Max Planck Institute for the History of Science

2009

Preprint 380

Sabine Brauckmann, Christina Brandt, Denis Thieffry, Gerd B. Müller (eds.)

Graphing Genes, Cells, and Embryos: Cultures of Seeing 3D and Beyond

Contents

Introduction
Sabine Brauckmann and Denis Thieffry .......................... 5

Acknowledgments ............................................... 13

I Forma
Dimensions of Embryos, Cells, and Molecules

The Problem of Assessing the Dimensionality of Information
Richard Burian ................................................ 17

Building Simple Two-Dimensional Molecular Models to Illustrate a Complex Subcellular World
Costis Papanayotou ............................................ 23

Imaging Fate: Tracking Cell Migration in the Developing Embryo
Sema K. Sgaier ................................................ 29

Pander, d'Alton and the Representation of Epigenesis
Stéphane Schmitt .............................................. 33

Pattern of Scanning, Tracing and Assemblage

Refined Concentration of Botanical Expert Knowledge and Images for Gaining Passions for Plants
Marianne Klemun ............................................... 41

Viewing Chromosomes
Soraya de Chadarevian ......................................... 57

Scales of Line, Circle, and Arrow

Adaptive versus Epigenetic Landscape. A Visual Chapter in the History of Evolution and Development
Silvia Caianiello ............................................. 65

Cellular Dimensions and Cell Dynamics, or how to Capture Time and Space in the Era of Electron Microscopy
Ariane Dröscher ............................................... 83

II Instrumentaria
Tools Matter

Agnes Arber: The Mind and the Eye
Maura C. Flannery ............................................. 95

In Three Small Dimensions: X-Ray Microtomography as a Way of Seeing
Brian D. Metscher ............................................. 105

Optical Projection Tomography: Revealing the Visible
James Sharpe .................................................. 111

Tracing the mtDNA Trail
Mait Metspalu ................................................. 117

Patterns of Perception

Embryos and Empathy: The Model Experiments of Wilhelm His
Thomas Brandstetter ........................................... 123

"Placing Oneself at an Adequate Point of View". Santiago Ramón y Cajal's Drawings and the Histological Look
Erna Fiorentini ............................................... 133

The Cell as a Technical Image
Matthias Bruhn ................................................ 143

Movement and the Creation of Images
Silver Rattasepp .............................................. 153

Spaces of Interpretation

What do Genetic Maps Represent (and How)?
Marion Vorms .................................................. 161

Visualising the Invisible in Cell Biology: The New Face of the Cell from the Late 1980s
Norberto Serpente ............................................. 177

III Rhetorica
Visual Pedagogy

Science Joins the Arts: The Collection of Watercolors and Drawings of Marine Organisms at the Stazione Zoologica Anton Dohrn
Christiane Groeben ............................................ 195

Visualizing Sexual Differentiation: The Rhetoric of Diagrams
Shelley Wall .................................................. 203

Authors and Editors ........................................... 209

Introduction
Sabine Brauckmann and Denis Thieffry

Biologists have always visualized their objects to a greater extent than physicists or chemists. Since the early 19th century, hand-drawings, professional illustrations, idealized diagrams, and microphotography have been used extensively to visualize data and support one's own analyses, followed by time-lapse motion pictures in the 20th century. Nowadays, visual practices encompass video and digital imaging, including virtual dissections and rotating panoramas of embryonic features. In general, these scientific images are means to store and exchange experimental data and to generate specialized knowledge of biological objects. The objects can be individuals, such as an embryo or egg, cells, biomolecules, or sets of animals or plants. In contrast with physical entities, they are visually flexible phenomena, whose boundaries, extension, and identifying details are studied to explain dynamical structures and processes such as embryogenesis, cell morphologies, or biomolecular networks.

To capture this visual specificity of the life sciences, the BioGraph workshop series focuses on elaborated analyses of how the life sciences visualize and represent their objects of study. At the first workshop, held in Naples in May 2007, we analyzed (1) the biological material considered in priority, e.g. embryos, cells, and genes; (2) the types of graphical representation used, e.g. fate maps, cell lineages, or gene networks; and (3) the techniques employed to construct these graphs, e.g. hand drawings, diagrams, tables, micro-photographs, time-lapse motion pictures, or computer imaging. Our objective was to disclose the production of biological knowledge on embryos, cells, and genes, from the early 19th century to in silico biology, by comparing the graphic models of classical disciplines evoking lifelike images within the mind, from embryology and cytology to the most recent computer imaging techniques.
We further elucidated in detail which experimental procedures schooled scientists to coordinate eyes and hands when redrawing, over and over again, images of cells pushing against each other, or when fixing the boundaries of embryonic layers that permanently move and shift their position inside an embryo. On the occasion of the second workshop of the series, held in Berlin in June 2008, the imagery produced by current experimental or in silico research was contrasted with earlier observational studies depicting organisms, embryos, cells, genes, and molecules from around 1800 to the twenty-first century. We addressed the issue of how biologists render the dimensionalities that living organisms exhibit when taking shape. From a historical perspective, moving from two dimensions to three has been, and still is, an enormous problem, although biologists currently combine three-dimensional exploration across time with the characterization of gene expression patterns. The treatment of dimensionality appears to differ widely among the biological disciplines.

To fully trace the (epistemic) steps of representing three-dimensional specimens, the scientists offered detailed information about their techniques, tricks, and tools for visualizing genes, cells, embryos, and species, using images of distinct scales and dimensions. In addition to clarifying the process of image construction, they explicated what their images reveal, what is filtered out, and why. In contrast, scholars presented historical case studies on the changing practices of visualization, encompassing the specificity of different biological disciplines, specific techniques, model organisms, and styles of communication. Their task was to delineate an epistemic archeology of spatial forms that occur naturally beyond the range of unaided vision, e.g., rotating embryos, differentiating cells and their components, describing the fine structure of organisms,




or illustrating the branching of species in genetic and developmental models of evolution. All these questions relating to the spatial and temporal deployment of individual organisms can be transposed to the population and evolutionary levels. As the workshop unfolded, we followed the appearance and changing meanings of graphical symbols, owing either to different theoretical perspectives or simply to technical constraints. For example, arrows were and are used to represent displacement or dimensional modification (e.g., in von Baer's observation of how a two-dimensional plate coils into three-dimensional tubes). Similarly, the development of three-dimensional organs such as Drosophila wings and legs is projected onto two-dimensional cell tissues (imaginal discs). In this respect, participants portrayed the corresponding specific styles of visualization and attempted to clarify the modes of mapping mental images or concepts to material models.

In Berlin the following issues were addressed: (1) How did shifts in the conception of cells or genes relate to different styles of imaging? (2) To what extent have imaging techniques themselves influenced, directed, and modulated scientific observation and data analyses? (3) How does it feel to be inside a cell or inside a molecular space? (4) Which visual representations are utilized for scientific lectures, public talks, scientific articles, textbooks, and mass-media publications, and how are they used? The workshop's main objectives were (1) to reconstruct some chapters of the visual biography of genes, cells, and embryos in the life sciences, and (2) to trace the influence of specific issues, such as dimensionality, scale, and pattern, on biological imagery from around 1800 to the twenty-first century. Furthermore, we initiated a practical collaboration on iconic images, which will be further explored during a forthcoming summer school in Tartu (Estonia), in 2010.
BioGraph II was organized in three main parts, each about one day long. The first (forma) provided an overview of the main workshop topics, which were further elucidated and enhanced on the second day (instrumentaria) by debating the manual and epistemic tools used to visualize experimental data from the eighteenth century to the most recent innovations, such as live-imaging or molecular tracing techniques. These technical aspects were complemented with contributions dealing with the epistemology of perception and the standards of observation, as well as the rules of interpretation. The third day (rhetorica) focused on how to communicate biological data visually – in the classroom, at scientific conferences, for a wider audience, in science museums, and even in hospitals, e.g., in order to demonstrate disorders of sex development.

Starting with FORMA, Richard M. Burian asked the essential question: what has visual representation not accomplished so far, and what can it not accomplish when restricted to the modes of visual representation commonly used to deal with (biological) information and (systems of) heredity? In plain words, the question was whether any linear or two-dimensional representation is adequate to capture the informational content of a sequence of events such as gastrulation. Following up, Costis Papanayotou described how simple two-dimensional drawings of interacting molecules are frequently used to represent conclusions reached by complicated experiments and how the results are analyzed. A similar challenge, namely how to represent a complex, unfolding four-dimensional narrative on a simple two-dimensional page, was outlined by Sema K. Sgaier with her work on the genetic fate mapping of the embryonic cerebellum, which develops from a handful of identical cells into an elaborate three-dimensional structure.
Irrespective of technical constraints, in both cases the images are limited to presenting only the information that is experimentally needed at that very moment. The historical counterpoint was provided by Janina Wellmann with an analysis of Pander's seminal treatise on chick development. According to her approach, the establishment of the germ-layer theory was largely a pictorial endeavor that forced the observer to grasp the dynamics of




embryogenesis by perceiving the transition between the depicted embryonic stages in his mind. To emphasize the three-dimensionality of embryogenesis, Pander and d'Alton turned the embryo on its axis in their hand-drawn figures, presenting it from both its anterior and its posterior side to visualize the developmental process. At the Naples workshop in 2007, Stéphane Schmitt had also presented the embryonic images of d'Alton. However, he focused on Pander's epigenetic reasoning and how it differed from Karl Ernst von Baer's reflections on the same issue. According to Schmitt, the plates of d'Alton are interesting from an epistemic point of view, since they comprise no legend in themselves but are covered by a tracing paper bearing schematic interpretations and legends. This mode of representation can be connected with Pander's conception of embryology as a gradual, epigenetic transformation (as opposed to preformation) with an intermediary stage, the formation of simple germ-layers.

Turning to (plant) tracing, Marianne Klemun demonstrated that plant morphology worked with a mental image that derived abstract forms from the diversity of plant phenomena. For the three-dimensional exhibit, the herbarium sheet was artificially arranged and thereby transformed into an epistemic object beyond nature. She elaborated how the production of the drawing determined the visual end product; in Reichenbach's Icones Florae, this related to how the tracing paper mediated between the dried plant on paper and the drawing, incorporating detailed expert knowledge. Resuming the discussion of scanning, pattern, and assemblage, Nancy A. Anderson showed that the recognition and investigation of pattern and rhythm in organic form at the ultramicroscopic level, using techniques such as x-ray crystallography, has played an important role in the disciplines of molecular biology, genetics, and developmental biology.
She related the ideas about pattern and rhythm in genetics to Waddington's discussion of organic form in the essay he wrote for the important symposium "Aspects of Form" at the Institute of Contemporary Arts in London in 1951. Soraya de Chadarevian examined how closely work on chromosomes has been associated with practices of visualization and explored the relations of particular chromosome images to the epistemic work surrounding chromosomes. She further discussed the early use of computers in analyzing chromosome pictures, thus probing the role of the human observer and highlighting the specificities of chromosomes as visual objects.

Tackling iconic images of evolution, development, and genetics, Silvia Caianiello compared Waddington's epigenetic landscape to Wright's adaptive landscape, a diagrammatic representation of his theory of shifting balance. The strong appeal of Wright's diagram, which appeared as an ingenious schematization of a highly sophisticated mathematical scaffolding, established its status as an image endowed with a heuristic power far more effective than the theory itself. She argued that the issue at stake in Waddington's visualization was mainly the unification of embryology and genetics, although it involved crucial consequences for evolutionary thinking, particularly concerning the actual role of chance. By comparing the visual material of Wright and Waddington from the 1930s to the 1950s, she concluded that the two models are utterly different in the scale of the evolutionary and genetic processes envisaged – where the dynamics of evolutionary change counters the buffered stability of developmental processes – as well as in the resulting direction of such processes.
Extending the thread of FORMA to the cellular realm, Judy Johns Schloegel introduced Jennings's studies of cell lineages and presented his visualization of cellular space as a manifestation of his pursuit of a via media between the ideals of observation and experiment in the life sciences. By describing his sophisticated technique of handling eggs, pen and paper in hand, she explained Jennings's conviction that the camera lucida – emblematic of mechanical objectivity – was insufficient for the task. Rather, Jennings believed that his hybrid representational strategy brought about something more important, namely visualizing the relations of cells to one another




in a completely accurate fashion. He likewise chose to utilize optical sections of an intact egg rather than physical sections, changing the field of vision by moving the microscope stage. Another way to capture the time and space of the cell's dimensions and dynamics was rendered by Ariane Dröscher, who demonstrated how electron microscopy focused the eye of the observer both mentally and technically. Electron microscopy revealed many sources of error in the interpretation of Golgi apparatus images made by light microscopy, although the latter were highly illuminating about endocellular organization. To paraphrase Klee, Golgi's technique did not reproduce the visible; it made things visible. Continuing with electron microscopy, Maria Strecht Almeida described how the images of erythrocytes produced by Marcel Bessis and coworkers starting in the 1950s provided new and powerful ways of visualizing these cells and their fates. Regarding the fate of old cells, Bessis's images, made using different microscopic techniques and showing the engulfment of injured erythrocytes by phagocytes, are clear and fascinating. Further, Strecht Almeida discussed the question of whether we are seeing a level of organization beyond their length scale when we look at those images.

Under the rubric of INSTRUMENTARIA, Maura C. Flannery introduced the abundance of Arber's plant drawings, focusing on the hard work of reasoning by eye and hand about the specimen observed through a microscope. Klee once stated that the result of such observations and dissections can be the ability to draw inferences about the inner object from the optical exterior – and, what is more, to draw intuitive inferences. According to Flannery, Arber's views can be applied to molecular biology, where the diversity of proteins is reminiscent of the diversity of plant species and structures.
Again, the computer is there to help, but making sense of the changing images on a computer screen requires a great deal of visual processing in the brain. Brian Metscher introduced the technique of computed tomography scanning and demonstrated its power for direct three-dimensional imaging of small biological samples. This technique can now achieve resolution down to the size of individual cells. Using the technical offspring of labor-intensive methods of reconstructing sectioned samples, it is now possible to register morphological and molecular information in a single volumetric image. But imaging is not seeing: imaging is technical and computational, while seeing is perceptual and cognitive. This technology for direct microscopic three-dimensional imaging makes small biological structural systems visible to us, but we must also learn how to visualize three-dimensional structures and relationships in our own minds, and to incorporate them into our theories and models.

In contrast, James Sharpe aimed at fitting the whole (embryo) into the picture, because a common direction for technical developments in microscopy has been to focus on smaller and smaller details of biological systems. The first question often asked about a given technique is: what is its resolution? However, the increasing recognition that biological specimens should also be studied in a more holistic way – as a complete system – means that a new goal for imaging is being recognized. Rather than always trying to dissect out as many small components as possible, we have to realize the need for a better overview of the whole system. Optical projection tomography (OPT) is one new technology suitable for whole-organ and whole-embryo imaging.
Because it does not focus on the usual goal of revealing invisibly small components of specimens, it can rather be described as performing the slightly ironic task of revealing the visible – providing new perspectives on specimens that are in fact easily visible to the naked eye. Pursuing the issue of resolution, Mait Metspalu presented the puzzling fact that inferences based on frequencies of low-resolution markers may be far from the truth. For two decades, DNA sequence variation of haploid systems (mtDNA and the Y chromosome) has been used to study human population history and phylogeography. Their recombination-free, uniparental mode of inheritance allows a true hierarchical genealogy to be reconstructed, which can give insights into population origins and movements. His genetic and anthropological case study focused on the




Munda speakers of India, who share a substantial part of their Y chromosome pool with Southeast Asians. At the same time, no obvious trace of this Asian ancestry has been found in their maternal gene pool (mtDNA). When studying the mtDNA traces, his group was led to reconsider their initial interpretation completely. They reached the puzzling conclusion that looking at different resolutions of DNA sequence variation among populations can produce different, not just more detailed, interpretations.

Thomas Brandstetter approached the philosophical theme of Einfühlung by looking at the simple folding experiments of Wilhelm His, who wanted to find out how mechanical causes shape the development of the germinal disc. Brandstetter focused on two aspects: (1) how the appropriation of geological modelling techniques transformed the embryo into a spatial object that could be regarded as an ideal topological unit, comparable to a mountain and devoid of any effects of scale, and (2) how this peculiar form of representation was provoked by an aesthetic of empathy instead of being effected by an epistemic visibility (Anschauung). Here Brandstetter asked in which way empathic mimesis contributed to a specific form of three-dimensional representation. An example of a repetitive drawing style was presented by Erna Fiorentini, who described the workflow leading to Ramón y Cajal's highly sophisticated drawings. Given the special attention Cajal paid to the preparation of his specimens, he created an optimal basis for visually extrapolating the object's properties. After a particular drawing was completed, Cajal adopted a strategy of long-term modification of the first images. This strategy meant reorganizing the image and setting it against new knowledge, which, conversely, influenced his current perception of both the specimens and their representation.
In this way, the process of observation resulted in perceiving an ideal visualization, i.e., continuously "combining the images" produced from an ever more sophisticated understanding of nerve tissues. According to Matthias Bruhn, similar, if not exactly the same, visual concepts (perspective, linearity, and three-dimensional modelling) have been present in, and have guided the formation of, all two-dimensional representations of cells, tissues, or organs in anatomy and morphology. This claim applies to the intellectual modelling and communication of vital and spatial functions and processes, as well as to more complex imaging technologies. He furnished his argument with a discussion of images of plant cell structures from around 1800 to 1830 in terms of an iconological and metaphorical texture. These illustrations, made using microscopy and reprinted in books, became the focus of a debate about the architecture of plants and the particular physiological differences between fauna and flora.

Stepping back, Silver Rattasepp reflected on the relationship between perception and the nature of pictorial images. More precisely, he considered the nature of a simple image, such as a line drawing, and its creation by an organism whose perception of the world depends fundamentally on movement. Traditional approaches depict perception within the paradigm of eye-as-camera and stimulus-response theories, and consequently understand pictures as still representations, or snapshots, of the visual field. If, on the other hand, we take seriously the enactive approach to perception, according to which perception depends fundamentally on movement, and add to it further elaborations on the nature of organisms as constantly on the move both temporally and spatially, the very nature of the act of creating images may need some rethinking.
From a philosophical point of view, Marion Vorms analyzed the role of the variety of formats of representation in scientific theorizing by looking into the technique of genetic mapping as employed by geneticists from 1913 to 1934. 1913 is the date of the construction of the first linkage map by Alfred Sturtevant; 1934 is the date when observations of the banding patterns in the giant chromosomes of the Drosophila salivary gland enabled geneticists to map the linkage maps they had assembled using purely Mendelian numerical techniques onto pictorial representations of chromosomes. Vorms focused on the debates among geneticists (mainly the debates between




Castle's and Morgan's groups, and between Bateson's and Morgan's groups) concerning the way linkage maps had to be constructed and interpreted. According to her, attention to the way representations are analyzed by various practitioners of genetics is a fruitful approach to classical issues concerning the identity of theories. Construing theorizing as a cognitive activity that involves manipulating and reasoning with concrete representations in particular formats enables philosophers to do justice to aspects of scientific practice that are most of the time neglected by traditional approaches.

Norberto Serpente traced another interpretative scheme in decoding the visual change from the cytological tradition (microscopes) to the "molecular vision of life". Working with the images in successive editions of The Molecular Biology of the Cell (Alberts et al., 1983-2007), he argued that the conceptualization of the new visuality emerging in cell biology in the early 1980s deserves a different approach from that offered by classical analysis. Two lines of thought bring a fresh and stimulating perspective to this imagery shift, namely semiotics and simulation theory – in particular the variant developed by Baudrillard – connecting to the cultural context in which these visual changes emerged and unfolded. According to Serpente, this change in cell visuality created a sort of hyperreality in the sense of Baudrillard: an enclosed universe of images as signs without (physical) referents.

The theme of RHETORICA concluded the workshop. First, Caitlin Berrigan opened up our perspective to art, using the images of viral capsids such as HIV, SARS, and hepatitis C that popular magazines like Newsweek and Scientific American feature as elaborately colored three-dimensional graphic images. The role of visualizing disease and microbes in the popular imagination is critical in motivating public understanding of the invisible worlds of cellular phenomena.
Imaging viruses brings the world of disease into the relational human realm, enabling the public and scientists alike to, in a sense, know their enemy. Further, Berrigan questioned the origins of these images by addressing questions such as how these images are made in a post-microscopy era, what kinds of interpretations distinguish scientific endeavors from artistic ones, what our tolerance is for visual interpretation in both the scientific and artistic realms, and how this might have an impact on the public perception of disease.

A special educational tool for the diagnostics of sex development was presented by Shelley Wall. She is currently developing a Flash Web module that uses dynamic scripting to visualize the hormone cascade leading to sex differentiation and its variants in the embryo. Diagrams of the hormone cascades that regulate human embryonic sex differentiation often take the form of a flow chart depicting two parallel tracks, female and male. Such diagrams may be used in student textbooks, for example, to explain "normal" sex differentiation, or in patient education to explain disorders of sex development (DSD). But there is a trade-off between a static flow chart's clarity and legibility and its ability to depict the full complexity of genetic and hormonal interactions. Moreover, a two-track flow chart cannot represent the range of variation possible in sex development, except to depict DSDs as "derailments" of the normal process. Dynamic visualizations are not only potentially more information-rich than static diagrams; they also reconceptualize the process of sex differentiation as a network of possibilities rather than a strictly binary process. Turning to another object, namely proteins, as well as another audience (undergraduates and protein biologists), Laura Perini and Cheryl Kerfeld presented their visual example of ortholog neighborhood diagrams.
Continuing the issue of "drawing inferences" when interpreting images or analyzing visual content, they dealt with the different interpretations that students and experts draw from genome sequence databases. The growth of genome sequence databases presents a challenge: to develop tools to analyze and represent data such that important features can be evaluated and new patterns, which may emerge from visualizing large amounts of



data, can be detected. Other features are elided; novices and experts are nevertheless able to infer important information. Ortholog neighborhood diagrams provide a way to compare coding sequences in their genomic context to orthologous coding sequences in other genomes, and thus have the capacity to reveal functional and evolutionary relationships visually. Beginning with a single shell protein, they showed the ortholog neighborhood inferred from a BLAST search. Zooming in on one isolated line of the diagram, they explained how to read it and what undergraduates can understand from it with basic instruction. Finally, they explicated the difficulty of drawing any conclusions from individual lines presented separately and discussed the use of images in research publications, where a series of figures often relate to each other. The reader comprehends the relationship between evolution and functionality by navigating between the text and the series of figures.

Another aspect of how to communicate scientific knowledge through words and images had been tackled by Christiane Groeben at the Naples workshop in 2007, where she concentrated on the watercolours and drawings of marine organisms at the Stazione Zoologica. Unlike universities, research institutions have different and more diversified needs to diffuse knowledge. They may offer assistance to patrons or organise services in line with their institutional policies. At a time when photography was only starting to become a scientific tool, the Naples Zoological Station institutionalised the contribution of artists to the scientific illustration of marine organisms. The aesthetic dimension of cells and their components was brought to a close by Sonia Schadwinkel, a scientific illustrator, who draws her three-dimensional images step by step, manually, with color pencils. When the figure is finished on paper, she scans it and assembles the hand-drawn cells and cell particles with computer software.
In the completed image the cell's elements are shown in the correct distribution, but their size is exaggerated to make their shape and composition visible. The selection of color for directing the viewer's attention to specific cell elements is intentional, even when the illustrations are in black and white, e.g. in textbooks used by high school students to prepare for the university-entrance diploma. Schoolbooks, and also science books for children, simplify complex biological objects to ease the non-specialist's understanding of the image and its message. Although computer graphics is nowadays a mature discipline, providing excellent software tools and already creeping out into the public sphere and the fine arts, scientific illustrators still resort to hand-drawings when computer images become too complicated – some parts are simplified (e.g. the cell membrane), others are extremely densely packed – as David Goodsell argued in Naples. One reason why manual drawing is still prevalent is that the illustration maintains its liveliness when hand-drawing and computer processing are both employed to produce an image. For natural objects like cells, depicted using the computer alone, look more like technical artefacts than organic beings, as Schadwinkel stated.

Implicitly running beneath the surface of nearly all subject matter raised during the discussions was the theme of mental images, which guide the analyses and constrain what is seen and the extent to which the specimen becomes visible to the experimenter's eye. Another pertinent theme was the handicraft underlying the fabrication of images. All participants agreed that drawings, figures, and diagrams have relied upon repetition, recursion, and improvement when depicting the same biological objects over years, decades, even centuries, until abstract forms of the diversity of life's phenomena become molded by our hands and into our minds.
Therefore, our hands play an essential part in imaging, in addition to the purely manual work of sectioning a slice of the specimen in question, arranging it, or drawing it. The impact of the tactile sense on visual representations even applies to the molecular level. Indeed, beyond the acknowledged demonstration and teaching usage, three-dimensional molecular models enable specialists to contemplate molecular structures, touch them, take them

11

Sabine Brauckmann and Denis Thieffry

in their hands, and turn them around to “feel” their molecular composition and organization, which otherwise remain invisible to the naked eye. Using current imaging techniques, the “whole” (embryo) can be visualized in delightful detail, a point lucidly illustrated with optical projection tomography (OPT). A consequence of the quest to improve technology is the part the observer plays in visual representations, which either relates to the ideal of observation or to algorithmic procedures taking place inside a computer. Although the situation of “just pushing the button” is rather comfortable for the researcher, the disadvantage is that students partly lose scientific literacy and become less able to interpret complex diagrams. Again, this affects the mental image, in a way. As puzzling as it may sound, computer imaging (or gene sequencing) has resulted in a revival of the (classical) method of observation, discerned as the most reliable source of knowledge and a visible basis of understanding. This development was probably helped by the fact that specific shapes and visual concepts created sign conventions (or icons). Thus, another issue that needs future attention – in addition to mental images and inferential reasoning – concerns the icons of biology: (1) icons of biological objects accomplished with accuracy, aesthetics, analyses, mental breakthroughs, or tacit knowledge; (2) icons of disciplines that either established new facts or calcified the canonical knowledge of the discipline in question; or (3) icons produced by specific technical tools such as electron microscopy, imaging software, model building, or optical tomography imaging.

12

Acknowledgments


Our brief précis of the Berlin Workshop on images in the life sciences would not be complete without special thanks to the Max Planck Institute for the History of Science, in particular to Hans-Jörg Rheinberger for being the generous host of our meeting, to Antje Radeck, Kirstin Müller, Birgitta von Mallinckrodt, Nuria Monn, Mirjam Lewin and Hartmut Kern for all their organizational support, and to Angelika Irmscher, Jan Bovelet and Daniela Kelm for helping to transform our manuscripts into a readable preprint. The Berlin Workshop was financially supported by grants from the Volkswagenstiftung (Germany), Tartu University (Estonia) and the Konrad Lorenz Institute for Evolution and Cognition Research (Austria).

13

I Forma Dimensions of Embryos, Cells, and Molecules

The Problem of Assessing the Dimensionality of Information: Cautions from a Mini-Case Study of Differences between (Biological) Information and Standard Calculations of Hereditary Information Richard Burian

In keeping with the spirit of our enterprise, this presentation will be brief and provocative. I will not present any images, but I will bring up a critical issue bearing on the adequacy of representations to specific purposes. The issue of concern is not the accuracy or faithfulness of images as such, but the need to assess the dimensionality of the information needed for the purposes at hand. If we aim, for example, to represent a developmental process or genetic information, we need to assess the dimensionality of the process or the relevant complexity of the information. That is, we must suit the representation to the key variables required for proper understanding of the complex objects or processes under consideration. At heart, I want to ask a philosopher’s question, one I do not pretend to know how to answer. The question is how to assess what a given visual representation cannot accomplish. In other words, what sorts of background information are needed, but not conveyed or conveyable by the image or representation? Or, perhaps better stated: how, in the relevant context of a scientific illustration or representation, under the relevant interpretative conventions, should we understand the limitations on the dimensionality of the information contained in, or constructable from, a given representation? This is critical, because we cannot expect a representation to yield understanding of a concept, problem, process or entity sufficient for a purpose in hand unless we can extract sufficient dimensions of information to handle the problem, entity, or process in question for that purpose. This problem is way above my head. But by starting from a concrete instance, I can give you some sense of what I am on about. For this purpose, let’s play a bit with the concepts of biological information and hereditary information.
For concreteness, consider the now-classical concept of hereditary information employed in genetics for much of the last half-century, a concept that has been the focus of much difficulty. I have a tentative, partial diagnosis of some sources of the chaotic situation that has arisen in this connection, one that touches on the biases of the (visual and abstract) representations that have been used to deal with both information and heredity. I will suggest that these problems trace back, in part, to some important biases and limitations of the representations, or systems of visual representation, that we have used. The starting point is the standard one-dimensional representation of genetic information embodied in the genetic code as a sequence of nucleotides. I will develop a few elementary points about the genetic code and the allied concept of information. It is clear, I claim, that genetic information, thus conceived, is insufficient for handling, for example, the development of organisms or the mechanisms involved in inheriting standard features of organisms, say pattern baldness or susceptibility to cystic fibrosis in humans. The problems caused by the insufficient dimensionality of the classical concept of hereditary or genetic information for these purposes should serve as a toy model for posing the difficult problem of assessing the dimensionality of problems of real interest to biologists in such areas as development, evolutionary developmental biology, and, for that matter, the molecular understanding of gene expression. This should be enough to suggest how to generalize some issues about the dimensionality of information required for dealing with problems of other sorts. I believe (perhaps controversially) that these problems feed back in strong ways on a number of issues about adequate visual representations of the causal or

17


informational basis required for resolving developmental, physiological, evolutionary, and other similarly complex problems. Not too surprisingly, my example of genetic information is strongly connected to Francis Crick’s Sequence Hypothesis and his version of the Central Dogma. Bruno Strasser’s lovely paper ‘A World in One Dimension: Linus Pauling, Francis Crick and the Central Dogma of Molecular Biology’ (Strasser 2006) started me thinking along these lines. Crick’s two doctrines got compressed and solidified in molecular biology into the supposition that linear sequences of nucleotides are, as such, the bearers of all things hereditary and that, in context, the sequence of amino acids in polypeptides did most of the work in determining the functions of all biologically interesting molecules. Crick’s substantial claims were overinterpreted by early molecular geneticists (largely unconsciously, I think) to set aside other molecular properties (e.g. three-dimensional conformation, temporal changes in conformation, and other properties) when they were thinking about hereditary information. They could get away with this as long as the order of amino acids could be inferred safely from sequences of nucleotides and the sequence of amino acids sufficed to specify the conformation and function of proteins. Yes, molecular biologists were always aware of some exceptions when it came to protein conformation (see allostery, for example), but many of the exceptions could be explained away by signaling devices that triggered interactions between two or more molecules, yielding specific, fixed alterations in allosteric molecules. But during the last twenty or thirty years, it has become clear that we cannot, in general, simply read off amino acid sequence from nucleotide sequence, and that there is much more than automatic self-assembly involved in the conformations assumed by the proteins that result from a sequence of amino acids.
Crick himself was far more cautious in his original formulation of the Central Dogma than this suggests. He was absolutely clear that we did not have adequate understanding of how proteins are assembled from polypeptides, i.e., how cells get from a sequence of amino acids to a functional protein with a specific conformation and specific activities. What he wrote indicated only that it appeared that geneticists could, in general, abstract from that issue, since some mixture of self-assembly and physiological or biochemical control took care of it, with univocal results in most circumstances. Thus, heuristically, geneticists could ignore that problem. Unlike the Sequence Hypothesis, therefore, which was already implicitly accepted by workers in the field, but which is daring and substantive, Crick treated the Central Dogma as a kind of strong heuristic principle or working hypothesis. It bypassed the then-leading model of protein or enzyme formation, which went under the name of enzymatic adaptation and was given specific biochemical content by Linus Pauling’s view of template formation of proteins well into the 1950s. According to Pauling, an indifferent substrate, which might accept different conformations and might well have a variant chemical composition, received its active conformation by a 3-D templating mechanism. This was based on the lock-and-key theory of antibody formation applied to the formation of enzymes, whence the name enzymatic adaptation. For our purpose, the crucial difference between this and the Central Dogma concerns what is required to give genetic information a three-dimensional shape. The collapse of enzymatic adaptation and of Pauling’s template model of protein formation, via the striking demonstration that sickle cell anemia results from a single amino



18

This point of view was commonly treated as a straightforward extension of the Weismannian tradition that heredity and hereditary information are restricted to what can be transmitted through the germ line, applied to the discovery that DNA is the genetic material. See molecular pleiotropy, epigenetic inheritance, the histone code, the compatibility of a fixed polypeptide sequence with many (occasionally quite distinct) protein conformations, gene sharing, chaperoning, RNA interference – the list of relevant ways in which nucleotide sequence-protein correlations break down is quite long.


acid substitution encoded by a single nucleotide change, and similar such successes, made it possible for geneticists to elide the higher dimensionality required for conformational information and to collapse the notion of information to a single dimension. As this new concept of genetic information became the norm, biologists quite generally lost sight of the collapse of dimensionality built into the new concept of (genetic) information as compared with the old. That, I argue, was one source of major problems built into the presumptions of molecular biologists about the specification, solely within the genome, of all of the hereditary properties of organisms. There are, thus, two principal components to the Central Dogma. The first is that sequence information serves, somehow, in ways eventually to be determined, to specify proteins in cellular contexts. The second is that sequence information does not get back to nucleic acids from proteins. The fundamental issue raised here does not concern the three dimensions of space or the four dimensions of space and time, but, rather, the sort of dimensionality analysis required to understand the patterns and causal dynamics of biological processes, e.g., the developmentally shifting readouts of a given nucleotide sequence. The concept of biological information, and the (perhaps narrower) concept of hereditary information, clearly cannot be adequately represented by a one-dimensional sequence (or a sequence-matching system with a fixed rule for readout) once intra-organismal variability of nucleotide readout is recognized. Assigning the appropriate dimensionality is difficult and contentious, but it is clear that there are cellular, physiological, organismal, and environmental components in the relevant information.
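The “single dimension” that this collapse left behind can be made concrete with a small illustrative sketch (mine, not Burian’s): hereditary information reduced to a string of nucleotides read off three at a time, with the classic sickle-cell case as a one-letter change. The sequences below are the standard textbook opening codons of human β-globin; the codon table is a minimal subset of the standard genetic code.

```python
# Illustrative sketch: the one-dimensional concept of genetic information
# treats heredity as reading triplets off a string. A single nucleotide
# change (GAG -> GTG at codon 6) alters one amino acid (Glu -> Val),
# the sickle-cell substitution mentioned in the text.

CODON_TABLE = {  # minimal subset of the standard genetic code
    "GTG": "V", "CAC": "H", "CTG": "L", "ACT": "T", "CCT": "P", "GAG": "E",
}

def translate(dna: str) -> str:
    """Read a coding sequence three nucleotides at a time."""
    return "".join(CODON_TABLE[dna[i:i + 3]] for i in range(0, len(dna), 3))

normal = "GTGCACCTGACTCCTGAGGAG"   # first seven codons of normal beta-globin
sickle = "GTGCACCTGACTCCTGTGGAG"   # the same, with codon 6 changed GAG -> GTG

print(translate(normal))  # VHLTPEE
print(translate(sickle))  # VHLTPVE
```

Everything the readout ignores — conformation, cellular context, timing — is exactly what Burian argues the collapsed concept of information left out.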
In the end, we need to develop something like the dimensional analysis deployed in the physical sciences to evaluate the dimensionality of the hereditary information required to resolve particular problems, if we are to solve this conundrum. Given the differences between the hereditary systems of prokaryotes and eukaryotes and the variability of readout in eukaryotes, I suggest that there will not be a uniform answer, from organism to organism or from problem context to problem context, about the dimensionality required to handle the concept of hereditary information. But I do not have a good suggestion at this point for pursuing this problem systematically. We have here something like the dialectic that David Gooding referred to (Gooding 2004). We construct contextually adequate (or nearly adequate) simple models of information, push them until complexities not taken into account reveal serious inadequacies in our representation of the concept of hereditary information, and then go back to the drawing board to construct and/or simply try out alternative representations. What is going on in the case of hereditary information, I believe, is that organized biological systems, with enormously information-rich structures, in highly specific circumstances, use linear sequences of nucleotides as triggers for specific constructional processes more-or-less appropriate to the circumstances. The biological information contained in those biological systems (in turn composed of organic structures and environmental regularities) is what was bypassed in the concept of hereditary information. As long as the background information about the biological system could be set aside by sticking to appropriate contexts, the triggers could be ignored and the hereditary information that was used in a roughly 1:1 way to yield proteins could be counted as all of the relevant information.
In short, the hereditary information was the minimal information used by the information-rich biological system to construct the outcome supposedly encoded by the hereditary information. When the context was appropriately set (which turned out to be largely in prokaryotes), this gave a sufficient account of what to expect as output from a measured input. But this was accomplished by abstracting completely from the biological information built into the organism in virtue of which the transformation from nucleotides to proteins was accomplished. Once one turned to problems of development, to differential gene expression, and the like, hereditary information was radically insufficient to solve the problems posed.

19


I suspect that there is no way in sight for us to capture the full complexity of the biological information transmitted across generations with an analytically tractable representation. But there are elegant ways of capturing what we need for certain purposes and of learning something about the contextual limitations on each such representational scheme. The issue here is to characterize the circumstances and problems for which we can shrink the dimensionality of the relevant information, for the purposes in hand, to a manageable and tractable number of dimensions. The brilliant success of Crick’s Sequence Hypothesis plus the Central Dogma depended on the exceptional simplicity of the questions asked and the systems employed, a simplicity not likely to be found very often. What is difficult, I believe, is that we have no systematic way of drawing boundaries around systems in ways that reproduce that simplicity. The model to take to heart here, I think, is that of dimensionality analysis in physics. An analysis of the dimensionality of the state or process under investigation, or of the outcome of interest, compared with the dimensionality of the antecedent conditions, or of the conditions that are allowed to vary among the antecedent conditions, provides some sort of initial guidance about what is required for a solution of the problem. This might be one way of getting at a crude measure of what is being sought in the dialectic that Gooding described. The relevant information for correlating nucleotide sequence with protein structure is much greater than that contained in the nucleotide sequence alone. And this is still far less information than is required to understand how to get from DNA to most of the phenotypic properties of interest in organisms. Exactly how much greater, i.e., how many dimensions (in the sense of a physicist’s analysis of dimensionality) are required, is very far from obvious.
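For readers unfamiliar with the physicist’s procedure invoked here, the following is a minimal sketch (my gloss, not part of the text) of dimensional analysis as bookkeeping over exponents of base dimensions (mass M, length L, time T): an equation is admissible only if both sides carry the same exponents.

```python
# Illustrative sketch of the physicist's dimensional analysis: each quantity
# carries a dict of exponents over base dimensions; multiplying quantities
# adds exponents, and an equation must match exponent-for-exponent.

def multiply(a: dict, b: dict) -> dict:
    """Combine the dimensions of two multiplied quantities."""
    out = dict(a)
    for base, exp in b.items():
        out[base] = out.get(base, 0) + exp
    return {k: v for k, v in out.items() if v != 0}  # drop cancelled dimensions

mass = {"M": 1}
acceleration = {"L": 1, "T": -2}
force = {"M": 1, "L": 1, "T": -2}

assert multiply(mass, acceleration) == force   # F = m*a is dimensionally consistent
assert multiply(mass, {"L": 1}) != force       # m*l is not: the check fails
```

The point of the analogy is only this: such bookkeeping tells you, before solving anything, whether the antecedent conditions carry enough dimensions to determine the outcome of interest.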
So far as I know, the problem of dimensionality as I have posed it is not at present analytically tractable. But it is obvious that some of the necessary information concerns the immediate molecular and cellular environment, perhaps including certain aspects of the history of the relevant cell(s) for the controls governing the processing of RNA molecules, the conformation of strings of amino acids, and other matters affecting protein activities and interactions. What I hope I have shown (or, better, the intuitions I hope to have motivated) is that these problems involve extremely high dimensionality. And the dimensionality depends in crucial ways on drawing boundaries between what to count as background, what to count as relevant objects, processes, states, and changes of state, and what to count as antecedent and stable conditions. This is part of the articulation of the theoretical framework within which to work. We will need to work back and forth between high-dimensional representations, compressed for tractability into three- and occasionally four-dimensional images, to provide information with many more than three or four dimensions. How many dimensions are needed for a realistic analysis of a particular problem? How can we compress the representation of that many dimensions into a tractable, readable two-, three-, or four-dimensional format? How can we set up the conventions of our representation so as to compress the information with minimal loss? How can we use expanded representations, incorporating more dimensions than the three axes of static visual space or the four of space-time, to convey integrated information of the several additional dimensions needed even for simple problems? These are the sorts of open-ended and difficult questions that we face in most of the ongoing forefront work in fundamental biological research.
From this perspective, one key issue is this: How can we properly assess the dimensionality of the biological information required to solve particular problems? Are there procedures that will let us black-box complexes analytically, in a way that corresponds adequately to causally relevant modules, so as to simplify the dimensionality of the problems we face? And how can we best accomplish such black-boxing? The intuitive considerations put forward informally above suggest that the dimensionality of the most familiar (standard abstract and visual) representations of informational content is insufficient for many of the tasks we ask of it. In particular, the process by

20


means of which information is acquired in the time course of development (appropriately restricted, e.g., by processes within the egg and embryo before eclosion or hatching or birth?) needs to be taken into account in representing information and heredity. Having raised a mare’s nest of puzzles, I fear I must stop without much by way of constructive suggestions. But it might be appropriate to end on a slightly more optimistic note, by closing with a quotation from Gerd Müller, writing about the current state and accomplishments of evolutionary developmental biology:

Evo-Devo also prompts the development of new analytical tools. One example is the computational analysis of developmental gene expression and its evolutionary alteration. Since developmental evolution has been shown to reside less in the establishment of new control genes than in the modification of the dynamics of gene, cell, and tissue interactions, the precise topology, timing, and quantity of gene expression as related to changes of cell behavior and tissue properties becomes a target issue. This requires tools for the proper representation and quantification of three dimensional (and ultimately four dimensional) gene expression in developing embryos in order to determine the exact differences in spatiotemporal gene activation that are associated with phenotypic variation and innovation. (Müller 2007, pp. 508-509)

Perhaps work along the lines adumbrated by Müller will provide the analytic tools to yield a partial solution of the problems I have raised in this provocation.

21


Bibliography

Gooding, David C. 2004. “Envisioning explanations – the art in science”. Interdisciplinary Science Reviews 29 (3): 278-293.

Müller, Gerd B. 2007. “Six memos for evo-devo”. In Manfred D. Laubichler and Jane Maienschein (eds.), From Embryology to Evo-Devo: A History of Developmental Evolution. Cambridge, MA: MIT Press, pp. 499-524.

Strasser, Bruno. 2006. “A World in One Dimension: Linus Pauling, Francis Crick and the Central Dogma of Molecular Biology”. History and Philosophy of the Life Sciences 28: 491-512.

22

Building Simple Two-Dimensional Molecular Models to Illustrate a Complex Subcellular World Costis Papanayotou

The field of developmental biology studies the development of embryos from conception to birth. Initially this was the job of descriptive and experimental embryologists, but the advances made in the life sciences over the last 20 years brought embryology, genetics, and molecular biology together into this new discipline, which tries to explain how genes control different developmental processes. Embryos are complex three-dimensional entities that change over time, their different parts interacting with each other, reforming and refining themselves to produce progressively more complicated structures. Visualizing these changes was a big challenge even for early embryologists. As the field evolved, new challenges were added. Developmental biologists not only have to present the results of their experiments, they also have to describe the experiments themselves, which are often intricate and incorporate embryological, genetic, molecular, and biochemical techniques. The results, which are usually photographs of specifically manipulated and processed microscopic embryos, often give insights into the function of genes and thereby reveal a subcellular world that is invisible yet dynamic and real. Such conclusions are usually represented by simple two-dimensional drawings of interacting molecules. Over the years, a language of lines, shapes, arrows, and bars has evolved which, although not very strict, is nevertheless intelligible to all developmental biologists.





Figure 1: A headfold-stage embryo processed by in situ hybridisation showing the expression pattern of the definitive neural plate marker Sox2. Copyright by Costis Papanayotou.

23


The development of the whole embryo is an extremely complicated process. As a result, researchers break it up into small, more manageable areas of investigation. One such area is the formation of the neural plate, which in subsequent stages will become the central nervous system. The neural plate develops from a region of the ectoderm under the instruction of a specialized structure called the embryonic organizer and is marked by the expression of a specific set of genes, of which the most important is Sox2 (Figure 1). As the neural plate will give rise to the brain and the spinal cord, the study of its development is of special interest, and so is the study of the genes that characterize it, most notably Sox2. The regulation of Sox2 expression was recently studied in the chick embryo and can be used as an example of how developmental biologists tackle the challenge of presenting their research in a visually arresting, simplified, and comprehensible manner.



Figure 2: Schematic representation of the electroporation process: the embryo is electroporated at the early gastrula stage; shown in green is the DNA solution introduced into the embryo by electric current (thunderbolt). The embryo is grown until the headfold stage and is subsequently processed by in situ hybridisation for Sox2 (purple), to assess the effects of the manipulation on neural development, and by antibody staining for GFP (brown), to visualise the electroporated cells. Copyright by Costis Papanayotou.



Figure 3: Embryo electroporated with Geminin in the extraembryonic epiblast, incubated overnight and processed by in situ hybridisation for Sox2 (purple) and by antibody staining for GFP (brown). The electroporated extraembryonic cells (stained brown in B) express Sox2 ectopically (purple staining in the electroporated region in A). Copyright by Costis Papanayotou.

There are two main approaches to studying the function of a gene. Using the gain-of-function approach, the gene studied is introduced in a region of the embryo where it is normally not

24


expressed, and the effects of this manipulation are studied in this ectopic area. Chick embryologists can do this by electroporation: electric current applied over the area of interest opens holes in the cell membranes and pulls a DNA vector expressing a specific gene into the cells. The DNA vector also expresses the gene producing the green fluorescent protein (GFP) to mark the electroporated cells. The embryo is incubated until the desired developmental stage, and the effects of the manipulation are subsequently studied, usually by in situ hybridization, a technique that stains cells expressing a specific gene, followed by staining with an antibody against GFP to visualize the electroporated cells (Figure 2). For example, when Geminin, a gene that is normally expressed in the neural plate, is introduced in the extraembryonic region of a developing chick embryo, it induces expression of the definitive neural plate marker Sox2 in the electroporated cells (Figure 3). This experiment suggests that the normal function of Geminin in the endogenous neural plate is to activate Sox2 expression. This approach is complemented by loss-of-function experiments, in which a gene is inhibited in its endogenous domain of expression; it is known from studies in the frog embryo that when expression of Geminin is blocked in the neural plate, Sox2 is downregulated. This inhibition can be achieved by different means and at different levels: morpholino oligonucleotides (small modified RNA molecules) block expression of specific genes, while dominant negative forms of specific proteins block the function of these proteins. Another example of a loss-of-function approach to studying the regulation of Sox2 concerns Brahma, a protein that binds to a regulatory region of Sox2 called N2. N2 was known to regulate expression of Sox2 in the neural plate, and that piece of information made Brahma relevant to this study.
When a DNA vector expressing a dominant negative version of Brahma is introduced into the neural plate of a chick embryo by electroporation, Sox2 is downregulated in the electroporated cells (Figure 4). This result suggests that functional Brahma is required in the neural plate for expression of Sox2.



Figure 4: Embryo electroporated with dominant negative Brahma in the prospective neural plate, incubated overnight and processed by in situ hybridisation for Sox2 (purple) and by antibody staining for GFP (brown). The electroporated neural plate cells (stained brown in B) do not express Sox2 (lack of purple staining in the electroporated region in A). Copyright by Costis Papanayotou.

25


These and many more similar experiments point to a molecular model that describes how Sox2 expression is regulated in the neural plate at the time of its formation: Brahma, a protein ubiquitous in the early chick embryo, is bound to the N2 enhancer element of the Sox2 gene. This is the regulatory region responsible for Sox2 expression in the neural plate. Brahma bound to N2 can activate expression of Sox2; however, the heterochromatin protein 1α (HP1α), a known repressor of transcription, binds to Brahma and blocks its function as an activator. In the very early embryo, long before the neural plate forms, the fibroblast growth factor (FGF) induces Geminin, which replaces HP1α on Brahma, converting it again into an activator. FGF also induces another protein called ERNI, which binds to Geminin and recruits onto the N2 regulatory region HP1, another repressor related to HP1α that cannot bind to Brahma directly. As a result, Sox2 remains silent. At the time when the neural plate forms, a small gene called BERT is upregulated in this region. Its protein disrupts the interaction between Geminin and ERNI, displaces the HP1 repressor from the complex, and allows Brahma to resume its role as an activator of transcription and to induce expression of Sox2. This model can be presented as a series of drawings where lines represent the linear DNA molecule, blocks on these lines represent the genes and their regulatory regions, and different interlocking shapes symbolize proteins that interact with each other and with the DNA. Arrows denote activation and bars represent repression, while the succession of the drawings introduces the element of time (Figure 5).



Figure 5: Two-dimensional drawing representing a molecular model that summarises results such as the ones presented above and explains how different proteins (like Geminin and Brahma) interact with each other to regulate expression of the definitive neural plate marker Sox2 and confer a neural character to prospective neural plate cells. Copyright by Costis Papanayotou.
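One compact, hypothetical way to render the verbal model above (my own Boolean sketch, not the authors’ figure) treats each named protein as a truth value and reads Sox2 expression off a logical formula at three successive stages, mirroring how the succession of drawings introduces the element of time.

```python
# Hypothetical Boolean rendering of the Sox2 regulation model described in
# the text. Variable names follow the proteins named there; the logic is a
# paraphrase of the prose, not the authors' own formalism.

def sox2_expressed(fgf: bool, bert: bool) -> bool:
    geminin = fgf                    # FGF induces Geminin
    erni = fgf                       # FGF also induces ERNI
    brahma_active = geminin          # Geminin displaces HP1-alpha from Brahma
    # ERNI + Geminin recruit the HP1 repressor onto N2; BERT disrupts this
    hp1_on_n2 = geminin and erni and not bert
    return brahma_active and not hp1_on_n2

assert not sox2_expressed(fgf=False, bert=False)  # pre-FGF: HP1-alpha blocks Brahma
assert not sox2_expressed(fgf=True, bert=False)   # early embryo: ERNI keeps Sox2 silent
assert sox2_expressed(fgf=True, bert=True)        # neural plate: BERT lifts repression
```

The three assertions correspond to the three stages of the narrative: before FGF, after FGF but before BERT, and at neural plate formation. The drawing in Figure 5 carries the same information in spatial, iconic form.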

Such models often summarize conclusions from a big set of data and should therefore be visually arresting and memorable. They should contain all the relevant information to make them coherent and understandable, but on the other hand, they should be economical as cluttered models can be confusing. As a result, in the process of building such models, a lot of information is discarded, leaving behind only the pieces that are most pertinent to the undertaken study. In the above example, Brahma is the catalytic subunit of a big protein complex comprising six to twelve more

26


proteins, none of which is depicted in Figure 5. Its function is to relax the chromatin and make it accessible to transcriptional activators that bind to the regulatory regions of specific genes and activate their expression. In the case of Sox2, these activators are already expressed in the prospective neural plate. On the other hand, HP1α proteins act as oligomers that bind to chromosomes and compress them into inactive chromatin, making the regulatory regions of genes inaccessible to transcriptional activators and thus repressing the expression of these genes. None of this is included in the model. However, these missing pieces of information are well known from previous studies and at the same time not crucial to the understanding of the model; incorporating them would make the model less clear and intelligible. In short, developmental biologists have found ways to communicate their research visually in memorable and comprehensible ways. The methods used are not without their shortcomings and limitations, but they are often the best available to depict a complex molecular process in a simple and clear manner.

Discussion
Thieffry: I was wondering if you would consider this kind of diagram as a projection of all the accumulated knowledge – a projection in space – that is the pertinent one, the relevant one for the specific question that you aim to address. If so, it would mean you are especially interested in the roles of these genes because you have identified them as playing an important role in the process you are studying. You don't really care about all the co-factors. These genes are the important ones for your study, and you want to show how they map into a morphospace scheme, while discarding all other actors. Would you see your diagram as a projection, even though you know that the molecular reality is quite different?
Papanayotou: In general, it makes sense to simplify. I wouldn't be able to draw everything, and even less to make it understandable. It's too complicated.
Thieffry: No, it is still possible, in fact, you could do it, but it would be almost impossible to understand what your point is by showing the complex scheme, right?
Papanayotou: You have to concentrate on what your question is, and the answer to this question – I guess I did show that – and leave out what is not pertinent to your model.
Fiorentini: I found what you were showing really striking. You first showed the film, which is an appearance of the process going on inside, and then a two-dimensional model which is abstracted from the thing. It's your knowledge – it's not the thing – but a process. And this is, although a two-dimensional one, a mental dimension that you made visible.
Papanayotou: In a way, that is true, at least for the whole embryos and the cells. This mental dimension was formed by experiments like the ones I presented. We wanted to understand what is happening inside the cells, inside the nucleus, how things work, and then go back to the embryo itself, where new forms appear based on differential expression of genes like Sox2.
Sharpe: I guess a lot of the discussion is about what the difference is between your two different kinds of representation. To my mind there is always one very useful and simple way of distinguishing things. For any kind of mechanism, there are state variables and system parameters – state variables that change over time as the process goes along, whereas the parameters are invariable. In your case, it is more of a topological thing, you don't have numerical parameters, but you have a mechanism, that is behavior. So, the movies are the behavior, are the state variables. A state variable cannot be computed with system parameters. System parameters are things that are constant, they don't change over time. So, from a logical standpoint, this issue of projection – how you describe the difference – seems, maybe, to be a bit like that. One is the state variables running over time, and the other is the parameters of the system that are fixed; they define the mechanism.
Sgaier: A sort of comment, in terms also of Erna's – your schematic about the gene expression. There you try to visualize something we have never seen; actually, we don't know what it looks like. So, it's totally our imagination and interpretation, what we think things look like. We have never documented gene expression visually, other than through measures of gene expression at a very molecular level.
Papanayotou: That is true. Again, I think what I tried to do when I drew these circles and polygons was to create a figure that would stick in various minds.


Imaging Fate: Tracking Cell Migration in the Developing Embryo Sema K. Sgaier

I started my voyage into the realm of fate as a geneticist interested in determining how our cerebellum, the region of the brain that controls movement, develops into an elaborate three-dimensional structure from just a handful of identical cells. This question was fascinating because the pattern of the folia was conserved throughout evolution and regionally divided into anatomical and functional sub-regions. So how did these anatomical subdivisions and this intricate pattern of folia arise, given the elegant simplicity of the embryonic cerebellum? And how could one visualize the morphological evolution of the cerebellum at the genetic, cellular, and anatomical levels? This question could be answered by following the fate of the mother cells, set aside during early development to form the cerebellum, and of their daughter cells, using advanced genetic and visual techniques. What we needed to do was determine how these cells behaved during fetal development, when the brain's environment was in constant flux from mechanical forces and factors such as growth factors. Was each cell pre-destined to reach a fixed destination, or was this decision dictated by the environment that it encountered? We also needed a sophisticated visual tracking method to allow us to follow this development. We developed a novel and powerful mouse genetic technique (Inducible Genetic Fate Mapping) whereby a cell could be labeled at the desired time in the developing embryo, by changing its genetic composition, and followed indefinitely throughout all stages of development. With the appropriate use of promoters we marked specific medial and lateral regions of the murine cerebellum and derived a positional fate map of it. Through defined genetic manipulations we could follow cells from any region of the cerebellum and see how this magnificent 3-D structure developed. The results were fascinating!
Contrary to expectation, we found that a cell's final destination was determined by its original position (see Figure 1). Thus it was fated, or preprogrammed, to contribute to a sub-portion of the final elaborate structure, and this fate was replicable in every animal analyzed. Our studies illustrated that each cell took a specific path, and that their movement as a group caused major morphological changes in the developing cerebellum. We showed that progressive changes in the axis of the cerebellum underlie its genesis. My genetic fate mapping work led to another challenge – a visual one. How do we visually create fate-maps of the brain? How do we visually explore and represent the trajectory of brain development, which occurs along four dimensions – the spatial axes (x, y, z) and the axis of time (t)? The most obvious representation of all the axes, including time, is a moving image or video, but such representations are often not technically feasible or are limited in the information they provide. The challenge remains the representation of a complex unfolding 4-D narrative on a simple 2-D page. This body of work draws on my scientific findings, entitled Morphogenetic Movements that Shape the Mouse Cerebellum: Insights, and explores both the technical advances in fate mapping and the imaging needed to accurately represent cell migrations governed by genetic codes at varying scales.


Figure 1: A series of images of a mouse embryo taken at different time points in development. The series shows how the morphology of the dorsal neural tube changes and, in parallel to this, the fate of genetically marked cells (blue) initially residing at the most anterior region (indicated by the right arrow in A). Copyright by Sema K. Sgaier.


Discussion
de Chadarevian: I am interested in the connection between your genetic fate mapping and its visualization, because it doesn't seem to be arbitrary. The way you study this, it seems a bit like 'I do these experiments, and then I have a protocol' – is it really that way? And then, of course, we come back to the history of those applications. You know there is a tradition of representing cell movements and things like that. Do you go back to those, or do you really sit there and say, 'now I have to invent the model'? When do you come up with the models?
Sgaier: Ok, I'll start with your first question. No, they are not unlinked, they are not binary systems; obviously, quite a lot of the experimental model is gathering the data, and, as you know, the data is all about the visual. So, there is the question of what techniques you choose to look at your data, in terms of microscopes, and the pictures you pick out. I think when I was talking about the visualization, it is the simplification of the data. In terms of your second question – 'do I simply invent everything, or is there some sort of pre-pattern, or pre-set rules' – what I can say is that when I was doing these models – this is the work of 5-6 or 10 years – and when doing the scientific work you always present your data: you have a little bit of data, you present a little bit of data, and so on. These models came up from reiteration after reiteration: presenting data, people understanding them, coming back to you, you reinterpreting your models; it is a constant development. And, yes, of course I do look at the published data and published images, but at the same time I am trying to come up with something that suits this project or this problem in particular. So, a lot of these images are completely hand-drawn, and then they are all put together.
Schadwinkel: I think that is the point. I have another view of it. I am not involved in the research, I just see it and ask myself 'do I understand it?', and if not, how can I make it more understandable, more conceivable. And you show it in a good way. With the rows you get a sequence of time – there is a start, there is an end – very simple, a few simple tricks to get it. And it is easily understandable because you know where it is. First you show the microscopic things, then you take in the shape, the schematic thing, and everybody knows it has to do something with the microscope because it has the same shape, then you get the colors in. Of course, we have to show things that we can't really see, like genes – there you get a dot, a schematic imagination, it's enough, whether square or something else, that doesn't matter. But when you have this comparison between the real thing and the schematic thing, then it is easier to follow for almost everybody. Of course, there always remains something too difficult to understand when you're not in the research.
Bruhn: It's actually showing that something is split, and the next is what, I think, you would call an evolution of a process – we have an arrow that is indicating that there is A, then B, some series, and within the brain images you have little arrows indicating that there is a motion, as you have shown in this image [on the slide below, 3rd one]. I mean, science is now able to mark cells, and it is not only that we draw after nature, we draw in nature, within nature. It is unbelievable what is possible today. On the other hand, if we want to draw this process in a diagram, we don't just use colors, but we use arrows again. So, we do not rely only upon, let's say, the strength, the power of color to indicate 'here is a cell and there is a cell'. We need something to stress it again. And I wanted to point to it, I think we come back to it later, just to keep this image in mind.


Pander, d’Alton and the Representation of Epigenesis Stephane Schmitt

I would like to introduce the general question of the representation of epigenesis (i.e., the passage from a formless to a formed being) in the early nineteenth century. For that, I will choose the example of the embryological work and plates of Christian Heinrich Pander and Karl Ernst von Baer, published in 1817 and 1828, in order to show the problems scientists met when they wanted to illustrate epigenetic development.

Preformation and epigenesis around 1800
First, a historical sketch. In the early nineteenth century, two kinds of embryological theories stood in direct opposition. According to the first, which we can call the theory of preformation (some historians prefer the term 'preexistence'; see Roger 1993), a complete embryo, with all the parts of the adult, was already contained in the sperm (animalculism) or, for most scientists, in the unfertilized egg (ovism). This small individual contained eggs (or sperms) that contained smaller beings, and so on. This theory had prevailed since the late seventeenth century for a variety of reasons, but since the mid-eighteenth century the rival theory of epigenesis, which argued that the individual was not preformed in the egg (or the sperm) but formed gradually from initially unorganized matter, had been gaining ground. Epigenesis gained renewed favor in the second part of the eighteenth century. In particular, Caspar Friedrich Wolff played an important part in this evolution. In his Theoria Generationis (Wolff 1759) and De Formatione Intestinorum (Wolff 1768-1769), he argued that the embryo was built from a formless fluid. He was the first to glimpse the role of membrane folding in the formation of the organs of the fetus, and for that reason his theory was sometimes considered a forerunner of the germ-layer theory. In fact, there was no real germ-layer concept in his view, but the role he ascribed to embryonic membranes in the formation of organs (for example, the intestines) certainly influenced other embryologists. The theory of epigenesis consequently became more and more widely accepted in Germany in the late eighteenth century, not only among scientists but also among philosophers such as Kant or Schelling, so much so that when Pander began his study, preformation already seemed outdated to many. There is no doubt that this context guided him in his observations and in turn ensured the success of his publications.

The work of Christian Heinrich Pander (1794-1865)
Pander was born in 1794 in Riga. He went to the University of Tartu in 1812 to study medicine and there met Karl Ernst von Baer, with whom he struck up a friendship. Subsequently Pander went to Germany to complete his training at one of that country's prestigious universities. The famous physiologist and anatomist Ignaz Döllinger was then looking for a student to undertake a complete description of the early development of the chicken. Pander accepted the position and began working. He was assisted by the draftsman Eduard d'Alton (1772-1840). The three men worked together for several months in close collaboration. Pander opened the eggs and made the observations; d'Alton drew from life under his supervision. Döllinger gave advice on the methodology. The results were presented in Pander's doctoral Dissertatio, in Latin, in 1817 (Pander 1817a). The plates engraved by d'Alton were printed the same year, with a German text.


So Pander, Döllinger, and d'Alton worked together tirelessly for many months. More than two thousand eggs were incubated at between 35 and 40°C, then opened and observed at every stage from laying to the fifth day of incubation. The three men repeatedly compared their opinions and interpretations to minimize the risk of error. As we know today, development takes place at the surface of the yolk, and when the hen lays her egg, the developing embryo is a very thin and delicate membrane (Pander named it the "blastoderm") that can be injured when the shell is opened. According to Pander, his work was purely descriptive, but in fact the theoretical implications of this study were considerable. The method itself was meaningful: in contrast to their predecessors, including Wolff, they did not start by describing already-formed structures in the late stages of ontogenesis in order to work backwards to the earliest development. They simply presented what they observed at a certain point, without prejudging the fate of each part. The interpretation of certain embryonic structures revealed Pander's unambiguous adherence to epigenesis. For example, in the yolk of the egg, whether it is fertilized or not, there is a clearer zone, just beneath the yolk membrane, that corresponds approximately to the position of the future embryo. Today this "nucleus of Pander" (not a nucleus in the modern sense) is interpreted as a plug of whitish yolk with no particular significance for development, whose function is purely nutritive, like that of the rest of the yolk. Nevertheless, it was a subject of debate among physiologists for centuries, and many supporters of preformation saw it as the germ of the fetus. One widely held idea was that the embryo, initially buried in the yolk, emerged from it in the course of development, so that it gave the impression of a gradual formation, although in fact it was present from the very beginning.
As regards the membrane around the embryo, it was considered a kind of amnion, pushed to the outside as the fetus grew. In his own description of the egg at the beginning of incubation, Pander established that the "nucleus" existed in all eggs, fertilized or not, and that it was a simple local differentiation of the yolk. He wrote: "The nucleus, which appears as a white mass, [...] participates in the whole transformation of the yolk, loses its shape and gradually disappears around the seventh day [of incubation]." On the other hand, the membrane arises only in fertilized eggs and is "a single layer" made up of the smallest granules visible with a lens, resembling a thin disc (Pander 1817b, p. 20). He added:

As for the membranous layer, it is of the greatest importance during the chick's entire development. For the embryo chooses this layer as its seat and domicile, and, further, its substance also contributes greatly to the configuration of the chick; we shall therefore in the future call it the blastoderm. (Pander 1817b, p. 21)

Thus, Pander realized that the embryo is the membrane itself. This idea was probably the most daring one in his books, and he emphasized it on many occasions. He said, for example, that the blastoderm had to be carefully isolated before observation, “since all transformations of the fetus arise in and by it.” Furthermore he saw it as useless to use a high-power lens to observe the early stages, “for the blastoderm is still uniform throughout the germ zone” (Pander 1817, p. 9). Pander showed that the delamination of the blastoderm came next, after twelve hours of incubation, resulting in the formation of two lamellae: the “mucous membrane” (or endoderm, in modern terminology) facing the yolk and the “serous membrane” (or ectoderm) facing the shell. Afterwards the serous membrane undergoes a new delamination and a third layer, the “vascular membrane” (or mesoderm), appears. This was the first formulation of the germ-layer theory. According to Pander, the folding of the layers was not a passive phenomenon; for example, when describing the figures in the German edition, he wrote:


[They] do not represent dead membranes, whose folds, mechanically formed, would necessarily extend over the whole surface and would not be restricted to a given zone; such a view would inevitably lead to erroneous ideas. The folds that cause the metamorphosis of the membranes instead have a spontaneous organic origin, and they form at the proper place, whether it is by an expansion of the vesicles already existing or by the emergence of new vesicles, with no modification of the rest of the blastoderm. (Pander 1817b, p. 40)

This text attests to Pander's epigenetic conception of development: embryonic movements are initiated by active living matter, without the intervention of an external principle. This germ-layer theory, of course, had an important advantage for a supporter of epigenesis: because the early stages were extremely simple, i.e., single membranes, they confirmed both the absence of any preformed embryo in the egg and the gradual formation of the animal. Furthermore, whereas many previous authors began by depicting each organ in its final and completed state before studying its origin, Pander described the events chronologically and, in doing so, emphasized that he did not define the germ-layers by their fate. He wrote:

Actually, a particular metamorphosis begins in each of these three layers, and each hurries towards its goal; only each one is not sufficiently independent to represent by itself that for which it is destined; it still needs the help of its companions, and therefore, although already designated for different ends, all three work collectively until each has reached an appropriate level. (Pander 1817b, p. 12)

This text certainly does not completely preclude germ-layer specificity, but it underlines the fact that these membranes do not correspond rigorously to definite organs in the future embryo; they are only anatomical structures with no predetermined specificity.

The plates
Pander and d'Alton's plates raise several epistemological problems that are closely related to this question. First, from a general point of view, there was a major change in the scientific status of images in the late eighteenth and early nineteenth centuries. During the Renaissance, figures in zoological or botanical books mostly offered a summary of information, with little graphic quality or resemblance. As historians have suggested about anatomical books of the sixteenth and seventeenth centuries, plates did not aim to demonstrate something, but rather to confirm, to support a discourse, and to organize knowledge. To a certain extent, this perspective survived into the middle of the eighteenth century. Réaumur, for example, claimed that figures do not say more than the text. He wrote: "drawings say what they have to say more quickly; however, they cannot always represent all that we would like them to represent." (Réaumur 1734-1742, vol. 1, p. 52) He considered that their role was auxiliary and that they were not able to bring new knowledge. But other naturalists manifestly did not share this idea and thought that plates could offer information that was absent from the text. We can find an example of this promotion of images in the work of the French naturalist and traveller Michel Adanson (1727-1806) who, in his Familles des Plantes (Adanson 1763), devoted a chapter to "ways to make figures useful" and claimed that, even if drawings could not render certain properties such as smell or taste, they

showed the general form [le port] of the plants, the location and the disposition of their parts, all things which are more essential than the qualities mentioned above and which it is generally impossible to render accurately enough in a description. (Adanson 1763, vol. 1, p. clxxxiv)

This promotion of illustration in the descriptive biological sciences was very common at that time. For example, the illuminated plates in Buffon's Natural History of Birds had a certain pre-eminence over the text. Buffon wrote that they are "better than a long description, which would be boring as well as difficult, and always very imperfect and very obscure." (Buffon 1770-1783, vol. 1, p. vi) There were many consequences: first, the naturalist had to draw the illustrations or to carefully supervise their making by a professional illustrator. The drawings not only had to render reality as accurately as possible, of course, but the illustrator's eyes also had to be guided by the scientist. This is precisely what Pander did with d'Alton. This was a general trend in biology around 1800. In comparison to natural history, however, embryology raised additional, more specific problems. In particular, the role of interpretation is obviously much greater in the use of embryological plates than in the representation of complete plants or birds. There is a close and necessary link between these plates and the underlying embryological theory. An extreme case is the representation of homunculi in sperm in the late seventeenth century to illustrate the preformation theory. Wolff's plates are relatively schematic for the early developmental stages and become more complex for later stages. But the plates engraved by d'Alton for Pander's German edition are interesting from an epistemic point of view because they aspire to a perfect objectivity. They include no legend in themselves, but are covered with tracing paper that carries the schematic interpretation and legends. There is only one plate without tracing paper (one of small microscopic details), and, in fact, there is one tracing paper without an actual plate. It represents cross-sections of different stages, i.e., it is the only plate representing an (assumed) intervention by the scientist on the embryos. So, except in this unique case, there is no interpretation at all in the plates themselves; the interpretation is all in the tracing papers.



Figure 1 A and 1 B: From Christian Pander 1817b. Beiträge zur Entwickelungsgeschichte des Hühnchens im Eye. Würzburg: Brönner.

I do not know of other examples of such a dissociation between observation and interpretation within a given image in the embryology of the early nineteenth century (although we can find earlier examples in anatomy, e.g., the Traité d'anatomie et de physiologie by Vicq d'Azyr (Vicq d'Azyr 1786)). Pander shared Döllinger's epistemology, that is, the primacy of observed facts over speculation. Goethe and some Naturphilosophen (e.g., Oken) had a different conception of the relationship between observation and theory. Pander's attitude is interesting for us because he seems to be one of the first (and one of the few) to have understood that even the lightest intervention by the embryologist on the drawing (simply marking a boundary between two regions of the embryo, or giving a legend to one thing rather than another) had consequences for the interpretation of the image. I do not mean that he succeeded perfectly and that d'Alton's drawings were completely objective (we know, for example, that the choice of the represented stages is by no means neutral), but he sensed that there was a problem with the representation of embryogenesis, and he raised an epistemological question that recurs throughout the history of embryology. Pander's choice is closely linked, I suggest, to his interpretation of the nonspecificity of the germ-layers.

Comparison with von Baer's plates; specificity of germ-layers, cryptopreformationism
If we compare Pander's plates with von Baer's, we see that many of them are completely different, especially in Part I of Über Entwickelungsgeschichte der Thiere (Baer 1828-1837). With von Baer, we see very simplified representations of the different germ-layers, with false colors, dotted lines, and arrows (representing metabolic pathways or movements; see Tammiksaar and Brauckmann 2004). So there are many kinds of assumed interventions by the scientist on the images to capture a reality that would be invisible if the drawings were just true representations of what we can see at a certain time (like a photograph). Since von Baer attached much importance to the illustration of his treatises, the kind of representation he chose might be closely linked to his conception of the germ-layers. In particular, the color choices allow comparisons between different species (e.g., birds and mammals). But they also reinforce the idea of the well-defined identity of each germ-layer, and its specificity in space and time. So the spirit of von Baer's theory is different from Pander's ideas. The meaning of the germ-layers, according to Pander, was a purely topological one; for Baer, on the contrary, the histological and typological value of each layer was of prime importance. So, for him, the germ-layer theory offered, of course, a solution to the problem of epigenesis, but the quest for a specificity of each layer could also lead to a sort of "cryptopreformationism" (or at least to a "less epigenetic" conception than Pander's). This dichotomy within the germ-layer theory as early as the beginning of the nineteenth century probably had many consequences for the history of this notion and for the tension between an absolutely epigenetic and a cryptopreformationist interpretation of it. Images, therefore, played a very important part in this history.



Bibliography
Adanson, Michel. 1763. Familles des plantes. 2 vols. Paris: Vincent.
Baer, Karl Ernst von. 1828-1837. Über Entwickelungsgeschichte der Thiere. Beobachtung und Reflexion. 2 vols. Königsberg: Bornträger.
Buffon, Georges-Louis Leclerc. 1770-1783. Histoire naturelle des oiseaux. 9 vols. Paris: Imprimerie Royale.
Pander, Christian Heinrich. 1817a. Dissertatio inauguralis sistens historiam metamorphoseos, quam ovum incubatum prioribus quinque diebus subit. Würzburg: Nitribitt.
Pander, Christian Heinrich. 1817b. Beiträge zur Entwickelungsgeschichte des Hühnchens im Eye. Würzburg: Brönner.
Réaumur, René Antoine Ferchault de. 1734-1742. Mémoires pour servir à l'histoire des insectes. 6 vols. Paris: Imprimerie Royale.
Roger, Jacques. 1993. Les sciences de la vie dans la pensée française du xviiie siècle. La génération des animaux de Descartes à l'Encyclopédie. Paris: Albin Michel.
Tammiksaar, Erki, and Sabine Brauckmann. 2004. "Karl Ernst von Baer's Über Entwickelungsgeschichte der Thiere II and its Unpublished Drawings". History and Philosophy of the Life Sciences 26(3-4): 291-308.
Vicq d'Azyr, Félix. 1786. Traité d'anatomie et de physiologie. Paris: Didot l'Aîné.
Wolff, Caspar Friedrich. 1759. Theoria generationis. Halle: Hendel.
Wolff, Caspar Friedrich. 1768. "De formatione intestinorum praecipue, tum et de amnio spurio, aliisque partibus embryonis gallinacei, nondum visis". Novi Commentarii Academiae Scientiarum Imperialis Petropolitanae 12: 403-507.
Wolff, Caspar Friedrich. 1769. "De formatione intestinorum praecipue, tum et de amnio spurio, aliisque partibus embryonis gallinacei, nondum visis". Novi Commentarii Academiae Scientiarum Imperialis Petropolitanae 13: 478-530.


Pattern of Scanning, Tracing and Assemblage

Refined Concentration of Botanical Expert Knowledge and Images for Gaining Passions for Plants: From the Herbarium to the Engraving via Tracing Marianne Klemun

Plant illustrations, in different representational forms, have formed an integral part of the equipment used in botanical work since at least the seventeenth century. Plant illustrations generally had a multifunctional significance. They were used for the exchange of knowledge about plant varieties among botanists (who were mostly taxonomists) working in different locations; the pictures were often exchanged by letter together with dried plants, or instead of them. The illustrations served to clarify the striking or relevant characteristics of different species, and they were used as models to highlight the defining characteristics or distinctive features of a species. Plant illustrations had a special status in communication: the function of underlining an argument in the process of classification. They also affected the authentication or constitution of specific notions of species. In this way they enabled botanists to share their knowledge within the scientific community. But not only that: plant illustrations also played an influential role in the collection and dissemination of knowledge and were one of the integral features of botanical learning. We also have to consider that, through its images, botany itself managed to generate a special attraction. "No modern science has a more visual history than botany," as Martin Kemp pointed out, "and no other scientific illustrations have been more widely admired for what we may call aesthetic reasons" (Kemp 1996, p. 197). For a long time, the aesthetic aspect of plant illustrations was naively considered only from an artistic point of view, as can be seen in older literary sources. Nevertheless, the fact that recent research accentuates the importance of plant illustrations for the scientific discipline of botany should not lead us to ignore the aesthetic aspects entirely. Therefore, I will specifically consider the aesthetic component here.
As Nickelsen has pointed out, artistic ability was deliberately used to emphasize certain taxonomically relevant properties, especially those determining species and genus (Nickelsen 2006). The focus on the depiction of individual features and their accentuation in the drawing supported the botanical statement. Additionally, plant illustrations responded to the aesthetic feeling of viewers who were botanical experts, and also of consumers without scientific knowledge but with a great interest in botany – an interest initiated and enhanced by these paintings. The effect on a broader audience beyond that of experts is thus also an important aspect in analyzing the matter of images. The eighteenth and nineteenth centuries were a time in which botany changed massively from a leisure-time activity of amateurs into a profession. The field of taxonomy (not physiology, which belonged to chemistry) was a part of natural history. However, the more botany was professionalized, the more interest its proponents showed in expanding beyond the purely professional arena. Within this time span, a number of fundamentally different taxonomies were developed and adopted, and then flourished. The Linnaean system was well known and widely



E.g., Nikolaus Joseph Jacquin repeatedly sent drawings and illustrations to Carl von Linné at Uppsala to exchange knowledge with his colleague. See Linnaean Society London, Letters from Jacquin to Linné. (Linné praised Jacquin for his drawings; see Letter from Linné to Jacquin, February 26, 1760, nr. L.2679.) For the functions of images, see Kärin Nickelsen 2007.


taught. By the turn of the eighteenth century, and in particular after it, this system was replaced by many other natural systems, and botany as taxonomy thereby reached a dynamic stage. Compared to verbal descriptions and the three-dimensional material object (the herbarium), depictions of plants should be seen neither as complementary to a text nor merely as illustrations; in truth, they hold a different and independent meaning. A picture adds value to a text – Boehm called this in general terms “the iconic difference” between picture and reflection – and this additional value was conceptualized by contemporaries with detailed expert knowledge. Indeed, plant illustrations had the purpose of explaining to the expert, through visual emphasis, the relationship between the specific relevant features of species, and of visually presenting this relationship, as Nickelsen recently demonstrated in a comprehensive study, primarily using illustrations from the eighteenth century (Nickelsen 2006). In addition, Anne Secord showed in her study that botanists valued plant illustrations mainly because they offered great potential for teaching the skill of observation – a feature that was generally ascribed to the paintings:

Lecturers and writers aimed to inspire botanical activity in their audiences (whether lay or learned public) because in the first half of the 19th century they did not regard popular botany as a diffused knowledge for passive consumers. What underlies the approach of all the botanists I have considered is the emphasis on training reliable observers. All of them start from the point of being a learner who clutches a plant. Whether botanists then recommended the beginner to identify the plant by comparison with an illustration or a written description, their aim was always to develop the best means to improve and sharpen the skill of observation which is essential to the practice of botany, and to retain this knowledge in the mind.
(Secord 2003, p. 55)

Apart from purely historical questions, such as how botanists adopted a new style of plant visualization that was guided by scientific purposes, needs, and goals, and how illustrations became increasingly important as instruments for training the skills of standardized observation, we should also ask the epistemic questions of how the operation of the sign itself was registered in the ultimate representation, and if and how the production of the drawing determined the visual end product. We know more about the process of transfer from drawing to print than about how the drawings themselves were created. Of course, we do know a lot about the process of making illustrations, but only in a very few cases are we able to reconstruct exactly how the drawing was organized, how it was designed. Therefore, the main question I am interested in is how the drawing was initially produced, since this creation process was the first (of many) obstacles to be overcome. In her extensive studies of botanical illustrations of the eighteenth century, Nickelsen discusses how the illustration corresponds to the plant in life (Nickelsen 2000, pp. 95, 150). She focuses on the relation between the reality of the plant and the illustration and shows that for the majority of drawings, previously existing illustrations were used as models, copied and modified to optimize the quality of the pictures. Only in rare cases did the illustrators work directly at the natural site; rather, they examined the collected plants in the study. In addition to drawing the original plant they saw lying in front of them, they also consulted already existing drawings by other experts. Thus, they actually employed a kind of combinational technique that aimed at creating the best picture of a certain species.
I see no difference between the practices of the eighteenth century and my own case study here, in which the botanist Heinrich Gottlieb Ludwig Reichenbach (1793-1879) contributed to all phases of image production himself, 




Only a few botanists used other taxonomies, such as the one constructed by Antoine-Laurent de Jussieu; see Martin Langanke 2000, p. 52.
Boehm coined this expression in his classical study, Boehm 2001, p. 15.


while making use of illustrations produced by his predecessors. Of course, he only copied the flowers from other pictures, not the whole plant. In a typical case, many different branches of knowledge and different persons played a part before a plant illustration printed on a plate reached the consumer. The taxonomist collected, dried, and selected the plants, and instructed the artist, the engraver, and the colorist. Only a few botanists were able to draw the plants themselves. The Viennese botanist Nikolaus Jacquin, who became famous for his excellent Icones editions, had even attended special courses at the Academy of Arts to be able to create certain elaborate prints (Hühnel 1993, p. 63). The botanist Reichenbach, however, was an exception within the botanists’ community because of his extraordinary drawing skills, which enabled him to produce exemplars for print without any external help. As stated above, we have relatively little concrete information or material evidence about the items that were used as the basis for the drawings – living specimens, herbaria, or pre-existing illustrations. According to Nickelsen, fresh plant material, older depictions, and herbal documentations were used as models. The drawing itself was the result of a complex collation process that united different sources in one picture. This argumentation is based on the term “collation,” initially introduced by Müller-Wille following Linné, who preferred this style of reasoning, in which information from different sources was intertwined and presented in the description when identifying plant species (Müller-Wille 1999, p. 223). The process of drawing was very similar. The final output – the optimal picture – resulted from a collation of information taken from living examples, dried plants, and already printed illustrations.
Based on archival evidence, I will stress the important role the dried plant played as a decisive source for the drawings in Reichenbach’s case, and will demonstrate the problems that derive from this method by analyzing Reichenbach’s Icones. First of all, it is important to take into account that many artists used dried plants as models for their work. Linné stated explicitly that taxonomically relevant properties are preserved both in images and in an herbarium:

The means by which a definition is selected are four: number, shape, site, and relative size; so they are the same as in case of the genus; Section 167. These are constant everywhere, in a plant, an herbarium, or a picture.
(Carolus Linnaeus 1751 (2005), paragraph 282, p. 243)

This correspondence confirms that the connection between herbarium and illustration was highly esteemed and, therefore, even generated in an ideal fashion by experienced botanists, distinguished by their expert knowledge. In practical situations, too, the herbaria played a more important role than living plants, e.g., on journeys and during expeditions. When the specimens were lost and drawings existed, these pictures replaced the herbal exhibits as evidence for the description of a new species. This function can be easily substantiated by looking at Nikolaus Jacquin and how he handled the collection that he imported from the Caribbean (Jacquin 1762; see Hühnel 1992, p. 63). In the botanical department of the Natural History Museum in Vienna, a collection was recently found by Bruno Wallnöfer (Wallnöfer 2003) that was separate from the total herbarium

Carolus Linnaeus, Critica Botanica, p. 2001, paragraph 282: Ex hisce quatuor dependet externa structura plantae, qua una nobis ab aliis diversa repraesentatur. Has notas, non alias, repraesentamus in iconibus. Has conservamus in herbariis vivis: reliquiae omnes accidentales esse possunt. Hae [sic!] non dimuttunt Lectorem incertum & dubium; Hae [sic!] verba sunt ponderis & valoris. – These four determine the outer appearance of a plant by which one distinguishes it from others. These properties and no others we represent in an image. These we preserve in an herbarium: the remaining could all be accidental (translation by author). See also Nickelsen 2006, p. 94.


and that could be related back to a concrete example of illustrations in a printed work. Since the labels and numbering of the sheets in the herbarium could be identified with the printed work Icones Florae Germanicae et Helveticae, and the herbarium itself included additions, the link between the real objects and their representation is evident. Although the direct relationship is beyond doubt, this finding poses more questions than it solves, since the three-dimensional exhibit (the herbarium sheet) is itself artificially and scientifically arranged by the botanist: it is not simply real nature, but an epistemic object. While in the eighteenth century botanists normally employed good draftsmen working under their scientific supervision to accomplish all the transfer activity from reality to illustration, this special collection from the first half of the nineteenth century demonstrates that its creator, Reichenbach, clearly did most of the work himself. Without doubt, the herbal exhibits originate from Reichenbach himself. He arranged them on the paper and added pencil sketches such that the expert could use them directly for the engraving.

Before presenting Reichenbach’s herbarium in more detail, I will outline how a herbarium sheet represents an “epistemic thing” (cf. Rheinberger 2001), distinct from the plant itself. For the plant is transformed into a herbarium sheet only after an intensive process of drying, of preservation against a background (the paper), of being placed beneath an inscription, and of the use of writing to identify it. For instance, the dried example is placed on a sheet and labelled with a note of the place and date and a reference to the collector. In addition, it is given its scientific name and, if necessary, the inventory number of the collection or institution.
The herbarium specimen with its label represents a relationship among five items: the plant as an entity, the collector, its classifier, the place of discovery, and the institution. Together with its label, the herbarium sheet itself constitutes a document, a kind of entity. It becomes the claim of the collector who, as the first owner, secures for himself a place in the collective memory. Even in the course of a revision, the first inscription remains and is supplemented by further inscriptions, even when the classification yields no new result or merely confirms the original description through its type definition. Every revision requires a new label, irrespective of the fixing of the nomenclature of the plant, to which, according to the rules of nomenclature, there can be no further alterations. This continues to be the practice and was already widely used during the nineteenth century.

Let us now focus again on the recently discovered herbarium, the source for the printed illustrations. Its sheets differ from a herbarium as we usually know it because they feature no labels but are nonetheless inscribed with the names of the plants and the numbers of the illustrations, without mentioning the places where they grow. These numbers directly link each sheet to the planned and eventually realized print and not to an independent collection of dried plants. Blossoms and fruits were either added as pencil sketches or by using existing illustrations that were stuck onto the sheet. That means that Reichenbach manipulated the original pattern, the herbarium sheet, by adding specific details to the dried plants with the pencil. Therefore, the herbarium sheet contained the artificially arranged dried specimen and also the drawn additions in a comprehensive assembly. Furthermore, written notations in longhand can be found in the margins, giving instructions on how the exemplar could be optimized. It says, e.g., Blumenblätter regelmässiger (petals more regular).
Such indications, a disruption to the original schema that 




For a general overview, see Arber 1919; for the concept of inscription, or system of notation, see Kittler 1998; Lenoir 1998.
Natural History Museum [Naturhistorisches Museum], Botanical Collection [Botanische Sammlung], herbarium sheet [Herbarblatt] 06039, Paeonia corallina (corresponding to plate CXXVIII in Vol. 4).


aimed at regularity and symmetry, occur rather frequently. They prove that the original specimen – the artificially constructed herbarium – was actually altered in a second step, in search of a picture that transcended simple representation. For this, in another context, Reichenbach (1837, p. 20) uses the term Klarheit (clarity). In his view, clarity should be a given precondition of any observation. He was mainly concerned with the features of simplification and distinctness, which were certainly supported by the use of tracing paper. However, they were only effectively achieved through the additional intervention of the illustrator.



Figure 1: Herbarium sheet, Pulsatilla vulgaris. Botanical Collection, Natural History Museum Vienna (Courtesy of Natural History Museum Vienna).


The tracing paper is especially significant in this context. For between the dried plant on paper (the herbarium sheet) and the drawing, there is the mediation of the tracing paper, which served to copy the plant outlines in their original size. Now what function did this procedure have? The contouring technique that resulted from the use of tracing paper was a traditional procedure. Jacob Trew (1695-1769), one of the most successful draftsmen and botanists of the eighteenth century, who united both necessary abilities in one person, described the contouring technique:

First, I placed the items to be depicted in such a way that everything which had to be noted was visible; second, the painter made an outline of it; third, I examined the outline and compared it to the specimen; fourth, the painter completed the drawing; fifth, I held the completed drawing next to the specimen again; sixth, the painter and I re-examined the copperplate in comparison to the drawing.
(Trew 1740, paragraph 13)

The parts of the dried mounted plant on the sheets that were found in Vienna are about the same size as the printed plant illustrations. It can therefore be assumed that Reichenbach used them as a direct template for the print. The differences between the herbarium exemplar and the print are so minute that one could call it a reproduction that is strictly bound to the original. And yet, the two stages differ significantly: The arranged herbarium sheet was used as an instrument and point of departure to develop the optimal image of a plant. The representation was copied via tracing paper, reproducing the contours, and in this way the three-dimensional object was transferred into a two-dimensional one. Despite the strong attention to detail and the minimal differences between original and illustration, depth was created in the paper image by suggesting only the contours of some leaves instead of coloring all of them during the printing process. Thus, a distinction between foreground and background emerged within the illustration. This was mainly effected by accentuating and concentrating on specific leaves through varying intensity of color. Some leaves are even presented without color. The three-dimensionality of the illustration resulted from this process.

For what audience did Reichenbach commission these illustrations? Reichenbach, previously professor at the University of Leipzig, went to the Medical Academy of Dresden as Professor of Natural History in 1820. He was also director of the botanical garden and the royal natural history collections. Despite its collections, Dresden was not a place famous for institutionalized natural history before 1820. However, after the Napoleonic war the state of Saxony founded several institutions in its capital Dresden; in 1815, for example, a medical academy opened there. Several private scientific societies were also established, with members from the state bureaucracy, the medical academy, and the local medical professions.
Within this circle, Reichenbach became a very famous figure. During his first decades in the city, Reichenbach worked on his own natural system of botanical classification. The reform of botanical systems, which I mentioned before, was a dominant enterprise in European science in the early nineteenth century, with more than twenty different classification schemes proposed to replace Linné’s artificial sexual system of classification. As Reichenbach’s own scientific interests lay within the broader field of Naturphilosophie, his botanical system was similar to that of Lorenz Oken and also influenced to a great extent by Goethe’s Metamorphosenlehre. He was clearly convinced by Hegel’s dialectic method of synthesis via thesis versus antithesis, which he used in his own system as the unifying thread of argumentation. To relate Hegel’s method to Reichenbach’s approach, the herbarium sheet functioned as thesis, the tracing paper as antithesis, and the published illustration as synthesis. Botanical systems of classification could only be successful if they were accepted by a large number of people, and many of Reichenbach’s publications, including his Icones, attempted to  


See also Nickelsen 2006, p. 31.
See Nekrolog (H. G. L. Reichenbach) 1879.


attract new people to botany through his natural system, through the descriptions and the printed illustrations.

Whom did Reichenbach want to reach and win over to natural history? First, he largely approached Dresden’s literary circles, the group of aristocrats and middle-class men and women who gathered in local literary societies such as the city’s Late Romantic Liederkreis (Doehring-Manteuffel 1935; Phillips 2003). When he founded a botanical and horticultural society in 1828, Reichenbach recruited a number of prominent literary and artistic personalities to join him in his interests in plants. He also undertook several joint projects with the city’s leading literary journal, the Abendzeitung, which addressed a cultivated audience. The publisher argued that after a time Dresden also deserved renown for its exceptional “sensibility to the world of plants” (cit. by Phillips 2003, p. 50). As Reichenbach regarded women as the most sensitive creatures, he wrote a Botany for Women (Reichenbach 1837b), in which he attempted to introduce his natural system and his knowledge of Goethe’s Metamorphosenlehre for the first time. Although he did not undervalue his female readers, they constituted an easier testing ground for him when dealing with questions of taste. In 1830 Reichenbach found a local audience that shared his passion for carefully collecting, identifying, and collating natural objects, especially plants: the Isis Society. This society included local apothecaries, a clerk at the royal library, and many others of similar professions. Its membership grew to more than 90 people, who shared a special culture, a taste educated through the observation of nature, and who wanted to engage in self-education and in work concerning the region. Reichenbach did not belong to the liberal camp; as a conservative, he was a great supporter of the monarchy, with a personal relationship to the king through their weekly botanical discussions. Nevertheless, he was convinced that the study of natural history would create “devoted peace-loving citizens” (cit. by Phillips 2003, p. 57). The Isis Society moved towards addressing a nonregional audience, while Reichenbach tried to combine botany with art and bring this to the attention of both a regional audience and one conscious of a “German identity.” In his justification of the natural plant system, Reichenbach wrote:

Here [Dresden], where art and science meet, there seems to be a blessing whose blossoming allows the mutual attention and respect of these seemingly incongruous disciplines to bloom forth, and in the enjoyment of this blossoming, releases one from prejudice in judgement and dealings.
(Reichenbach 1837a, p. 22)

Botany seems to be particularly well-suited to this, for Reichenbach himself, recalling the art of drawing from nature and plants, stated:

a darkly imagined struggle to be obliged to make everything one has seen evident to oneself and to others opens up to the outside world the apparently inborn gift of drawing, and my mother’s brother, faithful Uncle Bartel, as a practising artist recognised my small talent and directed all my observation to reproduction through drawing. So I embarked on the great journey of learning to walk and to travel. And through my good father and the other teachers in the venerable Thomana, the observation of living creatures took on a direction that yielded to that of classical education, or at least constituted a strong parallel.
(Reichenbach 1837b, p. 19)

In his later life, the writer developed the conviction that without such parallel courses there would never have been a clear-thinking natural scientist, nor could there ever be. For only through such parallels can clarity be created, perhaps for both fields. Classical education, art, and nature – it is in this triad that the illustrations presented in the printed books are constituted. The book project of the Icones began in 1834 and continued beyond Reichenbach’s life into the twentieth century. More than 1000 pages were produced altogether.


Wolfgang Kaschuba stated that the symbolic display of aesthetic sensibility was the common code in nineteenth-century bourgeois culture, a uniting thread between the divergent groups that came to make up the German middle class (Kaschuba 1988). Pleasure and taste were firmly rooted in the fabric of middle-class culture, and the botany with which Reichenbach had wooed Dresden’s literary elite circles bore this out. That nature is the ideal reality for human beings – this was the central postulate of Reichenbach’s social, cultural, morphological, and natural-philosophical orientation. But how was such an ideal reality experienced? Collective excursions into the natural world formed the basis of access; he repeatedly emphasized the parallels between observation and knowledge:

What is achieved through observation, the pure and simple experience that can be provided by observing a single organism, what is experienced of the living thing by observing the whole, all these must be clearly reflected.
(Reichenbach 1837a, p. 22)

Clarity is a further key concept for grasping Reichenbach’s claims concerning the aesthetic form of the illustrations:

From the instances shown in the observation of individual plants, based in plant life, we may derive an observation of the world of plants in which, by virtue of repetition of those instances, the world of plants seems to be a unity, comparable in its development to one of its most highly organised individuals.
(Reichenbach 1837a, p. 7)

This remark allows the conclusion that, for Reichenbach, an herbarium is only an instrument on the way to securing higher knowledge. It has the function of making an observation – derived from a living thing – capable of repetition, and of bringing about an association of an ideal recognition of nature.

If we look again at Reichenbach’s herbarium and the printed illustrations and compare these items with other collections of herbaria and illustrations, we are struck by the specificity of Reichenbach’s visual arrangements (Figure 2). Unlike the common practices of the eighteenth and nineteenth centuries, where usually a single isolated plant was mounted on a sheet, Reichenbach usually combined species varieties on a single sheet, or different species within a family. Such a rare practice was also realized in a collection organized by Professor John Stevens Henslow (1796-1861) of Cambridge, whose aim of “collation,” as he called it, was to analyze the limits of variation within “created” species, as a recent paper shows (Kohn et al. 2005, p. 643).10 The special patterns of displaying plant sets turned out to be influential on Darwin’s concept of evolution, and “the Henslovian framework he had been given at Cambridge switched into a new configuration” (Kohn et al. 2005, p. 645). Concerning Reichenbach’s collated sheets, we may argue that the practice of comparing species as a strategy of observation was specifically stressed here by the combination of several plant species or species varieties on one sheet. When comparing Reichenbach’s different presentations of plants (e.g., the herbarium sheet with the printed illustrations), we can now summarize that the herbarium showed obvious differences from the common practice of the rest of the scientific community at the time. In a sample of more than 200 sheets, I found only two cases of labels that declared the dried plants in sensu herbaria.
All the other samples are consistently labelled with inscriptions referring to a single integrated transfer. The herbarium itself was created solely to provide exemplars for the prints.


10 With grateful thanks to Maura Flannery (St. John’s University, New York), who gave me the hint about the article on Henslow’s herbarium.




Figure 2: Printed illustration, Pulsatilla vulgaris, from Reichenbach, Icones Florae Germanicae (Courtesy of the Library of the University of Vienna).

Although the size and layout of the herbarium sheet are identical to those of the printed image in Figure 2, there are some interesting deviations in the print compared to the herbarium presentation. In many cases the individual plants seem to have been presented in a simplified manner, and the contours of the plant leaves and, not least, their veins have been made more prominent by the tracing-paper method. The Icones thus represent not living examples but the herbarium sheets. Nevertheless, knowledge of living examples was also part of the drawing process.


Handwritten notes on the herbarium sheets support this interpretation. For instance, there is an instruction for the print on the herbarium sheet that the panicle “should not be so curved, but more regular and that all the flowers should be put into a hanging position.”11 Botanists of the Natural History Museum saw in the technique of tracing and the use of the herbarium evidence of a type of representation, and interpreted Reichenbach’s undertaking as an attempt to come as close as possible to nature itself. For Reichenbach, this was by no means the case. From his point of view, the herbarium provided only a basic form of orientation; the tracing paper supplied him with an outline, which he then completed in a second step of intervention, an additional drawing process. Above all, the luxuriance of a plant was sacrificed in favor of clarity, and the idealized presentation was reduced to essential features. This emphasizes the variation among different individual specimens (Reichenbach described and defined many new taxa) and the morphological type, which corresponds to the three-dimensionality of the plant image created by suggesting only the contours of some leaves instead of coloring all of them in the print. This specific method of coloring underlines Reichenbach’s morphological concept.

I will use the term “type” here in a morphological sense, as a construct that includes the average properties of a species.12 Root, stem, and leaf are the basic categories in the notion of a morphological type, and in terms of their location and constellation they are defined in broad outline through their illustration in the herbarium. For the sake of symmetry, however, leaves were thinned out, and the flowers were mostly copied from other publications and illustrations – but not the representation of the stem. The artist’s hand intervened in the herbarium sheet with a pencil.
Compared with the end product, these drawings were crude, particularly since the engraver was given precise instructions as to which printed illustrations he was to copy with respect to a particular flower.13 The illustration always includes different and improved details compared to the herbarium. An argument for my emphasis on morphological features can be found in Reichenbach’s “natural system” and in the following sentences:

You see, dear reader, how the plant species, which we have called a system, has been allowed to vary in opposition to its nature mostly in the central area, in its development from the stem, and in its leaves; whereas the blossom and fruit – if you wish, as something positive – form the species type far more consistently, but each is also necessary, faithfully retained for you in synthesis, and only developed more consciously.
(Reichenbach 1837a, p. 322)

In addition to the taxonomic work, the botanical discourses of the period are to be regarded as epistemic references for these illustrations. Reichenbach’s contemporaries were very concerned with the conceptualization of a dynamic morphological type concept. Morphology worked with a “mental image” that derived abstract forms from the diversity of plant phenomena. This powerful idea is also found generally in the representations conceived by Reichenbach, who adopted Goethe’s regime of plant morphology in his images:14

11 „B. Nicht so sehr bogig, mehr regelmässig, Alle Blüthen hängend!“ – Natural History Museum [Naturhistorisches Museum], Botanical Collection [Botanische Sammlung], herbarium sheet [Herbarblatt] 06057 (corresponds to plate XXXI of the print).
12 This term should not be confused with the nomenclatural type or holotype, which refers to the particular physical specimen of a species on which the first definition and description is based; see Wagenitz 1996 and Jahn 2005.
13 Reichenbach used as a source for his illustrations (as he noted on his herbarium sheets) Friedrich Gottlieb Hayne’s Getreue Darstellung und Beschreibung der in der Arzneykunde gebräuchlichen Gewächse, wie auch solcher, welche mit ihnen verwechselt werden können (1805-1837).
14 In this sense Baron spoke of a classical or idealistic morphology, Baron 1968, p. 21; concerning the metaphor of the mental image, see also Dougherty 1994, p. 14.


Refined Concentration of Botanical Expert Knowledge and Images for Gaining Passions for Plants

For the complex of the existence of a true being, a German has the word Gestalt (form, shape) which abstracts from what is changeable and assumes that an essential entity is determined, concluded and fixed in its character (cit. by Jahn 2005, p. 23).15

In the description of his system, Reichenbach associated the term "illustration" with the idea of a mental image, as it concerned him as an adherent of the new concept of type:

For example, I would not be able to understand any kind of Veronica if the whole genus did not show itself to me in the appearance of the species just as I have traced it as reproduced from nature in the Flora Germanica, from its simplest beginnings of the axillary-flowering species, through the antithesis of the axillary flower clusters to the higher order synthesis of the refined axillary condition to the terminal cluster or spike. My way of looking and thinking has developed in such a way that in the reductionism that many writers pursue I feel myself pulled backwards and would gladly hasten from such circumstances into the surroundings of a freely continuing life and the freedom of nature. (Reichenbach, 1837a, p. 24)

Image and form reinforced one another. But Reichenbach was aware that he incorporated the "fragmentary dissection" of Linné (as he called it) very harmoniously into his own knowledge, as a consequence of synthesis, and that the plant illustration did not always fulfill the demands that he made of himself (Reichenbach, 1837a, p. 94). This was shown when he stressed:

All researchers stretch out fraternal hands to one another. One depends upon another and so develops a broader view. He rejects none of those facts which, when mastered for all, must be recognised as true; he does not selfishly shut himself away in his own system; indeed, out of inner spiritual necessity, he gives up his own view when he finds it untenable. He must then abandon it. Everyone and everything may be possible. These are pure and noble feelings, and they are the triumph of natural scientists, who therefore should, indeed must, form no sects, no castes, but a large and benevolent family. (Reichenbach, 1837a, p. 323)

The fact that science does not function in this way is quite another story! Finally, I want to emphasize once again that in the case of Reichenbach's great undertaking with the Icones, which served to bring people to botany and to persuade them of his mental images and the intellectual practice of synthesis, we have to acknowledge that his perceptual method was to combine purely taxonomic features with his concept of morphological types. The use of the herbarium served the purpose of showing the size of a plant and the relationship of its components, and, therefore, of endowing it with three-dimensionality. Tracing was introduced to mark the configuration of a plant and to reduce it to two dimensions on paper. For the copying process, the details of the flowers and fruits were taken from other illustrated works. Within the print, three-dimensionality was generated by the specific use of coloring for the plant leaves, and many of them were presented without color, emphasizing the image of the plant as a type in a morphological sense.

Discussion

Fiorentini: I think the practice of making such images is really crucial to your argument. You say you are doing objective reproduction, but at the same time you argue for optimizing the image – towards what? Towards idealization – idealization of a time, or idealization of a general type, or of a species, or what else? I mean, there is nothing new, just extracting singular features in order to arrive at something which is not visible, a species or an overarching type of plants.

15 See also Kuhn 1988, and Harlan 2002, p. 33.


Marianne Klemun

Klemun: Many questions – the objective ground is only the tracing paper. The ideal is the gestalt of the plant, and this is not Linné, because for Linné you have the features of the plant to describe the species, but there is more in the images of Reichenbach. There is also a sort of taste, a style of thinking in gestalt. So you have layers: the first one is to optimize and to take in all the tradition of classifying the species, that is one, and the next one is to be in an aesthetic space of taste. You see it when you look at this slide. This is how he worked; here are some flowers. He worked also in the Herbar, and I have not seen this before. I have not seen any Herbar where there was something drawn with a pencil. So, when you look at that [slide], you see, it is not only that you learn how to describe and classify the plant – this would be Linné and the taxonomist's goal – it's more that he wanted to give an image of the plant that has a development.

Bruhn: Also a question of technique: do you have an idea whether there is otherwise a technique of copying the gestalt of the plant? As artists did before, using paper and needles to trace it. So, do you have any traces of needles that they used when they produced it? Or, do you have an idea whether they experimented with early types of photography? Perhaps like Wedgwood did – the first surviving photograph that we have of him is exactly from that time.

Klemun: We have a lot of different practices. There were botanists who would say, "No, we don't want to have a drawing after the herbarium." So, they claim – I don't know whether they really did it – but they claim that they wanted to draw [from] nature, according to the living plant. This is only a claim, I would say. The other point is that we know a lot about how the drawing was made and how it was transferred to the print, because that is the real technique, but you don't have to be a botanist for it. Here you need professionals who work in the printing office, and there you work with the tracing paper. But then you change the mirror. So, there is a lot of technique in this stage. It was very important for the botanists to draw, except for Linné, who was not able to do that. So, he was not that close to the image. For him the picture was not as important as for other botanists, but the drawing itself was a very important technique for understanding plants.

Brandstetter: A question about the role of the observer – you made the point that there were demands for the modes of observing. And there must be differences [in] what the observer demanded for this picture, its function and models. And I was wondering – especially when you mentioned that they were copying the originals in original size, and you showed this sort of very vivid images – whether there is a mimetic moment or a deception involved. If so, could there be any aesthetic function for such strange mimetic effects in these difficult illustrations?

Klemun: I don't know much about it, but we have some material about the society where he used his drawings. I think it played a crucial role because he tried to persuade not only botanists, he also tried to educate the society members to understand his mimetic process, and for the society this was learning, observing. And I think he needed it for the ladies he taught botany. I do think that it was meant to show how the imaging process works, how the observation happens – going to nature, to see the living plant, to the herbarium and then to the print, to work through all these steps of learning.

Flannery: My question, or comment: there is also dimensionality involved. You are going from the two-dimensional picture, but that picture is trying to portray the three dimensions. So, there are tricks – he had to introduce dimensionality tricks. It seems to me that when he is copying a plant to the two-dimensional surface and trying to bring it out into three dimensions, he's got problems there, too.



Klemun: Yes, I think when you look at this plate, it is striking that he works in the print, because he doesn't work with the color – he makes a specific decision about what he renders in color and what not. But you can also use the argument that it is the scheme of coloring the plates which reconstructs the three-dimensionality.

Johns Schloegel: I was just wondering if he does say something about the way he sets up and arranges the herbarium? Are there any criteria for it? Because to me some of the herbarium sheets seem to be very well arranged in view of the future publication. It was a very strange picture, between crowded and at the same time already having the frame of the paper size.

Klemun: I think the herbarium itself is an image because it is done for an image. So, you have many images: you have the image of the herbarium, which is constructed, and then you have the image of taxonomy – then he puts in the crucial fruits and the like, which are important for deciding [between] species – and on the third level he has a mental image, in that he saw the plant as a developing one. But we do not have much work on herbaria as images; there is not yet much research done.



Bibliography

Anonymus. 1879. Nekrolog (H. G. L. Reichenbach). Sitzungsberichte der Naturwissenschaftlichen Gesellschaft Isis (1879): 1-8.
Arber, Agnes. 1910. Herbals, Their Origin and Evolution. A Chapter in the History of Botany 1470-1670. 1st Edition, Cambridge: Cambridge University Press.
Baron, Walter. 1968. Methodologische Probleme der Begriffe Klassifikation und Systematik sowie Entwicklung und Entstehung in der Biologie. In: Diemer, Alwyn (Hrsg.), System und Klassifikation in Wissenschaft und Dokumentation. Meisenheim: Hain, pp. 42-54.
Boehm, Gottfried. 2001. Die Wiederkehr der Bilder. In: Boehm, Gottfried (Hrsg.), Was ist ein Bild? München: Wilhelm Fink Verlag, 3rd Edition, pp. 11-37.
Doering-Manteuffel, Hanns Robert. 1935. Dresden und sein Geistesleben im Vormärz. Dresden: Risse.
Dougherty, Frank W. P. 1994. Goethes zoologische Morphologie und die Gesetze der Baukunst. In: Miscellen zur Geschichte der Biologie, Aufsätze und Reden der Senckenbergischen Naturforschenden Gesellschaft 41. Frankfurt am Main: Waldemar Kramer, pp. 11-29.
Gutman, Mathias and Michael Weingarten. 1996. Form als Reflexionsbegriff. In: Jahrbuch für Geschichte und Theorie der Biologie III, pp. 109-130.
Harlan, Volker. 2002. Das Bild der Pflanze in Wissenschaft und Kunst: Aristoteles – Goethe – Paul Klee – Joseph Beuys. Stuttgart/Berlin: Mayer.
Hayne, Friedrich Gottlieb. 1805-1837. Getreue Darstellung und Beschreibung der in der Arzneykunde gebräuchlichen Gewächse, wie auch solcher, welche mit ihnen verwechselt werden können. Berlin.
Hühnel, Helga. 1992. Botanische Sammelreisen nach Amerika im 18. Jahrhundert. In: Die Neue Welt. Österreich und die Erforschung Amerikas. Wien: Brandstätter, pp. 61-77.
Jacquin, Nikolaus Joseph. 1762. Enumeratio Stirpium plerarumque, quae sponte crescunt in agro Vindobonensi, montibusque confinibus. Vindobonae [Vienna]: J. P. Kraus.
Jahn, Ilse. 2005. Der Typusbegriff in der Geschichte der Biologie. In: Harlan, Volker und Ilse Jahn (Hrsg.). Wert und Grenzen des Typus in der botanischen Morphologie. Hagen: Martina Galunder Verlag, pp. 15-30.
Kaschuba, Wolfgang. 1988. Volkskultur zwischen feudaler und bürgerlicher Gesellschaft. Frankfurt am Main: Campus.
Kemp, Martin. 1996. "Implanted in Our Natures": Humans, Plants, and the Stories of Art. In: Miller, David Philip and Peter Hans Reill (eds.). Visions of Empire: Voyages, Botany, and Representations of Nature. Cambridge: Cambridge University Press, pp. 197-229.
Kittler, Friedrich. 1998. Discourse Networks 1800/1900, translated by Michael Metteer and Chris Cullens. Stanford: Stanford University Press.



Kohn, David, Gina Murrell, John Parker and Mark Whitehorn. 2005. What Henslow taught Darwin. How a herbarium helped to lay the foundations of evolutionary thinking. Nature 436: 643-645.
Kuhn, Dorothea. 1988. Grundzüge der Goetheschen Morphologie. In: Grumach, Renate (ed.). Typus und Metamorphose. Goethe-Studien. Marbach am Neckar: Deutsche Schillergesellschaft, pp. 133-145.
Langanke, Martin. 2000. Die Natur ordnen. Zur Genese und Status der biologischen Systematik. In: Jahrbuch für Geschichte und Theorie der Biologie VII, pp. 7-92.
Lenoir, Timothy (ed.). 1998. Inscribing Science. Scientific Texts and the Materiality of Communication. Stanford: Stanford University Press.
Linnaean Society London, Letters from Jacquin to Linné. Letter from Linné to Jacquin, February 26, 1760, Nr. L.2679.
Linnaeus, Carolus. 1938. Critica Botanica (Leiden 1737), translated by Arthur Hort. London: Printed for The Ray Society.
Linnaeus, Carolus. 2005. Philosophia Botanica (1751), translated by Stephen Freer. Oxford: Oxford University Press.
Müller-Wille, Staffan. 1999. Botanik und weltweiter Handel. Zur Begründung eines Natürlichen Systems der Pflanzen durch Carl von Linné (1707-1778) (= Studien zur Theorie der Biologie 3). Berlin: VWB.
Nickelsen, Kärin. 2000. Wissenschaftliche Pflanzenzeichnungen – Spiegelbilder der Natur. Botanische Abbildungen aus dem 18. Jahrhundert und frühen 19. Jahrhundert. Bern: Studies in the History and Philosophy of Science (Diplomarbeit).
Nickelsen, Kärin. 2006. Draughtsmen, Botanists and Nature: The Construction of Eighteenth-Century Botanical Illustrations. Archimedes, New Studies in the History and Philosophy of Science and Technology, vol. 15. Dordrecht: Springer.
Nickelsen, Kärin. 2007. "Abbildungen belehren, entscheiden Zweifel und gewähren Gewissheit" – Funktionen botanischer Abbildungen im 18. Jahrhundert. In: Hofer, Veronika and Marianne Klemun (eds.). Bildfunktionen in den Wissenschaften (= Wiener Zeitschrift zur Geschichte der Neuzeit): 52-70.
Phillips, Denise. 2003. Friends of Nature: Urban Sociability and Regional Natural History in Dresden, 1800-1850. Osiris (Special Volume: Science and the City) 18: 43-59.
Reichenbach, Heinrich Gottlob Ludwig. 1834-1859. Icones Florae Germanicae et Helveticae, vols. 1-12. Lipsiae: Fridericum Hofmeister.
Reichenbach, Heinrich Gottlob Ludwig. 1837a. Handbuch des natürlichen Pflanzensystems nach allen seinen Classen, Ordnungen und Familien, nebst naturgemäße Gruppirung[!] der Gattungen, oder Stamm und Verzweigung des Gewächsreiches, enthaltend eine vollständige Charakteristik und Ausführung der natürlichen Verwandtschaften der Pflanzen in ihrer Richtung aus der Metamorphose und geographischen Verbreitung, wie die fortgebildete Zeit deren Anschauung fordert. Dresden und Leipzig: Arnold.



Reichenbach, Ludwig. 1837b. Botanik für Damen, Künstler und Freunde der Pflanzenwelt überhaupt. Leipzig: Carl Cnobloch.
Rheinberger, Hans-Jörg. 2001. Experimentalsysteme und epistemische Dinge. Göttingen: Wallstein.
Secord, Anne. 2003. Botany on a Plate. Pleasure and Power of Pictures in Promoting Early Nineteenth-century Scientific Knowledge. Isis 93: 28-57.
Trew, Christoph Jacob. 1740. Osteologie oder eigentliche Fürstellung und Beschreibung aller Beine eines erwachsenen menschlichen Cörpers. Nuremberg: Adelbulner.
Wagenitz, Gerhard. 1996. Wörterbuch der Botanik. Die Termini in ihrem historischen Zusammenhang. Jena: Gustav Fischer Verlag.
Wallnöfer, Bruno. 2003. Über die Abbildungsvorlagen zu den Kupferstichen von Ludwig Reichenbachs "Icones Florae Germanicae et Helveticae". Annalen des Naturhistorischen Museums Wien 104 B: 553-562.


Viewing Chromosomes Soraya de Chadarevian

I recently became interested in human chromosome research, a field that only really took off in the mid-1950s, propelled by concerns about the mutagenic effects of radiation in the context of nuclear research, weapons development and the Cold War. Work on chromosomes centred on the morphology of the objects and therefore was tightly bound to practices of visualization such as staining, drawing, and photographing. At our last workshop I spoke about the recounting of human chromosomes in the mid-1950s and especially about the role of Joe Hin Tjio's metaphase chromosome photograph in establishing that there were 46 instead of 48 chromosomes and in claiming authorship of the 'discovery'. Albert Levan, the second author on what became a landmark paper and the head of the laboratory at Lund University where the research took place, put more stress on the camera lucida drawings of the chromosomes that for him encapsulated the analytical work that proved the new count. Tjio had a point regarding the rhetorical power of his photographs. They became iconic images, while Levan's careful drawings of lined-up chromosomes, which built on a long tradition of drawing chromosomes, were all but forgotten.

In my short presentation today I would like to focus on the practices involved in analyzing the chromosome images and on their change over time. This means that I will focus on the uses rather than the production of the chromosome images, although the two processes are of course intimately linked. I will follow three lines of investigation. Firstly, I will discuss how microphotographs were used to gain knowledge on chromosomes and their status as research objects. Secondly, I will analyze what cytologists did when they engaged with chromosomes and compare the work of the human observer with the procedures of the early automated pattern recognition systems. A final point concerns the question of what changed with the new technique of chromosome banding that was introduced in the early 1970s.
The analysis will provide insights into how knowledge about chromosomes was produced and about the role of images in knowledge-making processes. The research is at a very preliminary stage.

Cutting and pasting chromosomes

Drawing chromosomes became superseded as a technique in the mid-1950s not because of advances in photographic techniques but because of new preparation techniques for chromosomes. In the early work on chromosomes, drawings were used to resolve overlaps. In addition, when drawn on separate bits of cardboard with the help of a camera lucida, the single chromosomes could be handled and arranged according to length or other criteria. This technique was pioneered in the 1920s by the American zoologist Theophilus Painter, who first established that the number of human chromosomes was 48, a number confirmed by many other researchers for over 30 years (Painter 1923, p. 305). With the new preparation techniques chromosomes were fixed in the metaphase, where they are most condensed; the cells were swollen in hypotonic medium and the preparations squashed on the slide. This allowed the chromosomes to separate. The chromosomes could then simply be cut out from enlarged microphotographs, measured, and ordered according to established criteria. Apparently this technique was more widely used in the



US than in European labs, where karyotyping was done by direct visual inspection of chromosome metaphases under the microscope. The transformation of objects on a photograph into something that can be handled and rearranged is something that interests me, although I do not believe it was unique to chromosome analysis. I would like to hear about other sciences where photographs were used in that way. A comparison with 3D models that can also be manipulated and separated into parts would be of interest as well.

Measuring, measuring, measuring

Next to counting, measuring played a central role in the practice of cytogeneticists. The British geneticist Lionel Penrose was among those insisting on the importance of measurement for the exact description and identification of chromosomes. The importance of measuring can also be gleaned from the voluminous tables that formed the core piece of work performed by the Human Karyotype Standardisation meeting in Denver in 1960. The classification of chromosomes was based on the relation between the shorter arm and the full length of the chromosomes (Human Chromosomes Study Group 1960, Table II, p. 7). But measuring was not all that was involved. At least it was not the undisputed way to go about analyzing chromosomes. One of Penrose's correspondents set out to prove that measurement did not assist in the work of pairing and identifying chromosomes and that visual inspection was as powerful an approach. Surprisingly perhaps, confirmation for this position came from attempts to automate karyotyping. Karyotyping was difficult but also tedious. It was the limiting factor in plans to expand the technique for use in the clinic and for large-scale population studies. Much research relied on the analysis of this large body of data. For all these reasons, from the late 1950s, attempts were under way to automate karyotyping. In the UK this project was promoted by Michael Court Brown, also known as the 'father of population genetics' and head of the MRC Clinical and Population Cytogenetics Unit in Edinburgh. On his initiative a Pattern Recognition Group was established in London under Denis Rutovitz with the exclusive aim of using modern computers to automate karyotyping. The plan was first discussed in 1964 but work did not start until two years later. Eventually the group moved to Edinburgh.
The efforts of Rutovitz's group built on the work of Robert Ledley at the National Biomedical Research Foundation in Bethesda, who pioneered the use of electronic computers in biomedical research and, among many other things, had developed a 'Film Input to Digital Automatic Computer' (FIDAC), which automated the analysis of chromosomes. Rutovitz's group quickly decided that a film scanner was not practical and that it was necessary to work directly down the microscope rather than from photographs. Using film meant losing the possibility of altering the focus, which resulted in a loss of definition. The efforts to develop an automated method of karyotyping provide insights into how the human observer went about his or her work. Besides exposing the limitations of the computers at the time, the work also highlights the specificities of chromosomes as visual objects. From the beginning the aim was to build a computer system that would do what humans do, adding accuracy and speed. The first step to achieve this was to gain a thorough understanding of the steps involved in karyotyping. Indeed, I found the best description of the task of karyotyping in a paper on automated chromosome analysis. Here is the gist of the description Rutovitz and his collaborators gave.


For the following see Rutovitz et al. 1978, p. 307.


Technicians, who did much of the karyotyping, were trained to follow a working protocol which was developed and tested over a long period of time. It allowed them to assess quickly whether a cell had any abnormal chromosomes and whether all 10 groups of chromosomes appeared in the right number. Training took a year; when completed, it enabled the technician to analyze a chromosome set in 2-3 minutes, with a maximum of 50 cells per day per observer. This time remained a limiting factor in surveys when many thousands of cells needed to be analyzed. Protocols involved the following steps: the technicians began by identifying and counting the shorter chromosomes, distinguishing the acrocentric, then the metacentric chromosomes. The procedure was repeated for the larger chromosomes, eventually eliminating all chromosomes except those of the C-group (chromosomes 6-12 and the X chromosome), which were then counted. Once banding came in (see below), the analysis was more exacting and took longer. Nevertheless, by comparing the 46 banded chromosomes with a standard banding chart, an operator could assess a cell in 5-6 minutes. Automation translated this work into the following steps for each chromosome: digitization of the visual image, field segmentation, feature extraction, and classification. Pattern recognition for chromosomes related to quite a few other problems tackled by computer scientists at the time. Most prominent among these was the challenge of character recognition for both typescript and handwriting. This was of interest, for instance, for digital copying, and there were commercial interests involved. The connection between character and chromosome reading is intriguing, as chromosomes were often compared to letters, metaphorically and otherwise. Yet chromosome recognition turned out to pose bigger problems than character recognition because chromosomes could overlap and lie in different orientations.
There was no measuring involved in character recognition, and no interest in the density of letters or in translocations, as was crucial in chromosome analysis. Different algorithms were developed to overcome these problems in chromosome analysis, yet humans continued to outperform machines. As Rutovitz put it in a recent interview: very soon it became clear 'that human eyes can do tricks that are damn difficult to mimic with machines'. This confirms that measuring was not all that was involved in karyotyping. The solution to the problem of coming up with an automated system that would speed up the job of karyotyping was found in the development of an 'interactive system'. The principle was that the computer would find the metaphases in the large number of cells that made up a sample and present the chromosomes to the human observer in the best organized way, thus facilitating his or her work. Using parallel computers sped up the procedure further. As one of Rutovitz's collaborators explained: '... the power of the [automatic karyotyping] system will undoubtedly be enhanced by supplementing the measuring power of the computer with man's tremendous pattern recognition ability' (Hilditch 1970, p. 298).

Rutovitz's group also developed a system that allowed the user to point to a chromosome and move it away, which was very innovative at the time and provided further ways for the human observer to interact with the machine and facilitate his or her work. The group eventually came up with a small interactive computer-controlled microscope with a screen-based editor that was successfully commercialized. Overall, the comparison with automated karyotyping highlighted the importance of pattern recognition performed by human observers and its role in gaining knowledge on chromosomes

For more details see Rutovitz et al. 1978.
Interview with Denis Rutovitz, Edinburgh, 1 April 2008.



despite the widespread use of figures for the accurate description and distinction of chromosomes.

Banding

In the early 1970s chromosome banding came in as a new technique, adding a wealth of new information for the characterization of chromosomes. The new staining technique, based on the use of the antimalarial drug quinacrine dihydrochloride, was pioneered by Torbjörn Caspersson and Lore Zech in Sweden, originally to study gene activity during development (later, other banding stains were introduced). It soon became clear that the banding pattern was constant and characteristic for each chromosome. It could thus be used for the identification of chromosomes as well as for the detection of more, and smaller, deletions, inversions or translocations. Again, pattern recognition played a decisive role. An early paper by the Swedish group gave a detailed description of 24 different types of fluorescent patterns, each corresponding to one chromosome type and suited for direct visual identification. For chromosome 1 it reads:

The distal part of the short arm is dark, followed by two bright bands closer to the centromere. In the long arm there is a very bright band in the middle and the distal part appears to have a rather even fluorescence (Caspersson, Lomakka & Zech 1971, pp. 94-95).

Similar descriptions for all other chromosomes followed. The descriptions were accompanied by microphotographs in black contrast and by graphs of photoelectrically recorded fluorescence patterns for each chromosome type. Later, banded chromosomes were identified through side-by-side visual inspection and comparison with a banded chromosome chart, developed at the third human karyotype standardization conference in Paris in 1971. To my question whether with banding humans worked differently from machines (for instance by recognizing patterns rather than measuring bands), Rutovitz's answer was that the idea was always that machines would mimic humans while adding accuracy as well as speed. Yet he also said that for computers chromosomes were mathematical things, while humans looked for outlines or patterns. Again, computers proved suitable for lining up chromosomes for visual inspection and for measuring the dimensions and distances of bands. Pattern recognition relied on statistical methods and apparently was less powerful than human inspection.

Conclusions

Close attention to the changing practices involved in karyotyping provides insights into how knowledge on chromosomes was achieved and where the limitations lay. More importantly perhaps, it opens up questions on the place of visual objects and practices of viewing in knowledge-making practices, and on the shifting role of scientists, technicians and machines in producing that knowledge. Much more work on the empirical and analytical level needs to be done to spell out these implications.

Discussion

Wall: It is intriguing that humans are actually better at recognizing distinctions between the chromosomes than machines. And looking at these chromosomes and the fact that they have arms, the fact that the long arm is put ahead, that the decision is made to arrange that, to make an array – is that an anthropomorphism going on? [laughter] Why not? Reverse them. I don't know much about visual cognition as a science, but I am thinking we are wired to recognize human forms in great solidity – gait, shape, faces – much more than when you're looking at any other object in the world. So, is there some element of the human capacity to distinguish very finely



among human forms which might be part of what makes it easy to recognize various chromosomes in different postures, or the ones putting their arms up? [laughter]

Anderson: I have a comment, because when you are talking about visual pattern recognition you are talking exactly about the moment that this goes into the issues of computers and cybernetics. That is, visual pattern recognition, because it is incredibly difficult, for just these reasons, to have a machine imitate how a human recognizes forms in the world. And there is a lot of literature in cybernetics – I am actually searching for some here – in which they are talking about such projects and the difficulty of developing such programs.

de Chadarevian: If you think that measuring is important, then I think machines would outperform humans. But pattern recognition is still a very difficult problem; in the recognition of faces and things like that, humans still outperform machines.

Metscher: I think the problem there is that a human can look at something that is a variation of a standard pattern in mind and classify it appropriately, and you can see any variation of that and recognize it. I never saw you before yesterday, but I can say it's a human [laughter], without any problem at all. A machine has trouble doing that.

de Chadarevian: Actually, the computer would look at the prints, but the people decided that that wasn't good, that one should work straight down the microscope because you could change the focus. Otherwise you lose information, that was what they felt. They wanted to save time. The whole thing was to speed up the process, so humans could do it in 2-3 minutes, and the computer had to cut down on that time.

Anderson: The first screen to visualize things was introduced, I think, about 1955. I actually have an article of the Institute for Research Engineering, or whatever, that talks about it. That's about 1955, when a screen is connected to a computer for the first time.


Soraya de Chadarevian

Bibliography

Painter, Theophilus S. 1923. “Studies in mammalian spermatogenesis II: The spermatogenesis of man”. Journal of Experimental Zoology 37: 291-334.
Human Chromosome Study Group. 1960. “A proposed standard of nomenclature of human mitotic chromosomes”. Cerebral Palsy Bulletin 2(3): 1-9.
Rutovitz, Denis, D. K. Green, A. S. J. Farrow, and D. C. Mason. 1978. “Computer-assisted measurement in the cytogenetic laboratory”. In Bruce G. Batchelor (ed.), Pattern Recognition. New York: Plenum Publishing Corporation, pp. 303-329.
Rutovitz, Denis. 2008. Interview in Edinburgh, 1 April 2008.
Hilditch, Judith. 1970. “The principles of a software system for karyotype analysis”. In P. A. Jacobs, W. H. Price, and P. Law (eds.), Human Population Cytogenetics. Baltimore: The Williams and Wilkins Company, pp. 297-325.
Caspersson, Torbjörn, Gösta Lomakka, and Lore Zech. 1971. “The 24 fluorescence patterns of the human metaphase chromosomes – distinguishing characters and variability”. Hereditas 67: 89-102.


Scales of Line, Circle, and Arrow

Adaptive versus Epigenetic Landscape. A Visual Chapter in the History of Evolution and Development

Silvia Caianiello

The main characters of my narrative are two graphic representations that both played a pivotal role in establishing a new visual imagery for conceiving evolutionary and developmental change in the twentieth century. Notwithstanding their divergences, both in time scale and in the quite opposing directions of the processes they describe, they share some common features. They both aimed at visualizing the behavior of complex dynamic systems in terms of spatial relationships, and so aimed at being landscapes of “moving equilibria” (Wright 1931, p. 147; Waddington 1940, p. 92). Moreover, they share a basic constructional strategy, the collapsing of multidimensional phase spaces into three-dimensional diagrammatic representations; a choice that, although differently nuanced, was aimed at conveying complex processes in the immediacy of a visual device. Furthermore, beyond the explicit primacy of a communicative as well as a didactic intention, the major feature they have in common is the epistemic status of models, whose efficacy in this sense is testified to, at least wirkungsgeschichtlich, by the subsequent genealogy of models still referring to them as their ancestors, while also acting at the same time as images in the wider sense I just mentioned. The further, formal link between the two representations is a genealogical one. By adopting the idea of landscape, Waddington deliberately referred to Wright’s visual device, and I will try to work out some hypotheses about this historical connection in the context of the relationship between evolutionary and developmental research, in a span of time – 1932 to 1957 – that extends from the rise of the Modern Synthesis to the beginning of its “hardening” (Gould 1983).
I will not directly tackle the question of the ambiguity of their status between “algorithm and image,” and the ways they exceeded their original modelling function, but I hope to contribute somehow to the exploration of the imagery repertoire they drew upon, which might be assumed to be another possible factor substantiating the autonomy of models both from theories and data.

Evolutionary progress

The adaptive landscape was the expedient by which Wright – as he had been explicitly requested – presented his evolutionary theory in a concise and nontechnical way in 1932 (Provine 1986, p. 275). His model is therefore, first of all, a “model for” his shifting balance theory, which proved useful from the start as a general tool for visualizing other possible modellings of the evolution of Mendelian populations. More generally, it became the favored visual tool for the new style of biomathematical modelling, which provided the reconciliation between Mendelian genetics and biometrical statistics that lies at the core of the Modern Synthesis. As such, its historical relevance could perhaps be compared to the branching trees of nineteenth-century Darwinism. As Dobzhansky observed, this reconciliation was mainly brought about by three mathematical geniuses, Fisher, Haldane, and Wright, whose work placed biological science in general, and evolutionary genetics in particular, “in the position in which physics has been for many years.” The reduction of quantitative traits to the interaction among Mendelian particulate genes removed the last obstacle to representing evolution as the behavior of a population composed of atomized entities whose frequency was determined by the action of opposing forces, mainly natural selection and mutation. This original conception, which received a complete formulation in Fisher’s Fundamental Theorem, relied heavily on an explicit analogy with Boltzmann’s statistical mechanics of ideal gases, and secondarily with the second principle of thermodynamics, where mutation played the role of entropy and natural selection that of the main counterentropic factor. This approach is also characterized, somewhat paradoxically, by a highly probabilistic stance as well as by an evident foundationalist intention, the latter explicitly celebrated in Dobzhansky’s commentary. As has been observed, the goal of the founders of mathematical population genetics was “the theoretical-mathematical refoundation of biological science more than the application of mathematical techniques to particular contexts.” This is relevant in so far as Fisher and Wright soon agreed that the divergences in their theories were mostly related not to quantitative issues, but to qualitative stances.

See Wright 1932, p. 163-165; a discussion of the consequences of this collapsing of dimensions was conducted in his correspondence with Fisher in 1931, see Provine 1986, pp. 272-275; Waddington 1940, p. 27.
Gayon 1998, p. 341, with reference to Wright’s landscape.
See M. Morrison 1999; see also Sismondo 1999; de Chadarevian and Hopwood 2004; Levins 1966.
The notion of “model for” relates to Keller 2000.
Skipper 2004 emphasizes the role of the adaptive landscape model as a tool for evaluating different evolutionary theories. I would rather support the broader view that it was primarily also a tool for
The many differences characterizing their scientific backgrounds and approaches amounted to an evident divide in their representational horizons, even within the shared context of a thermodynamic imagery. In his statistical mechanical analogy, Fisher evidently starts with a very large, panmictic population of atomistic genes, like the one modelled in the Hardy-Weinberg equilibrium, and then manipulates its parameters in order to test and establish the prevailing power of natural selection over its evolution. According to Wright, on the contrary, the representation of the population structure that best “agrees with the apparent course of evolution in the majority of cases” is that of “a large species which is subdivided into many small local races, each breeding largely within itself but occasionally crossbreeding,” as portrayed in the last of the six possible landscapes presented in his 1932 paper (p. 166).
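For readers who want the Hardy-Weinberg result alluded to above in compact form (my gloss, not part of the original text): in an effectively infinite, randomly mating population with two alleles at frequencies $p$ and $q = 1 - p$, and in the absence of selection, mutation, migration, and drift, genotype frequencies settle after a single generation at

```latex
\[
p^2 \;:\; 2pq \;:\; q^2 \qquad (AA : Aa : aa),
\]
```

and remain there indefinitely. This zero-force baseline is the starting point whose parameters Fisher then perturbs to test the power of natural selection.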





 


visualizing them and evolutionary processes in general.
Dobzhansky 1962, p. 500-501, quoted in Provine 1986, p. 277: “Genetics is the first biological science which got in the position in which physics has been in for many years. One can justifiably speak about such a thing as theoretical mathematical genetics, and experimental genetics, just as in physics. There are some mathematical geniuses who work out what to an ordinary person seems a fantastic kind of theory. This fantastic kind of theory nevertheless leads to experimentally verifiable prediction, which an experimental physicist then has to test the validity of. Since the times of Wright, Haldane and Fisher, evolutionary genetics has been in a similar position.”
Hodge 1992 distinguishes analytically between the two and the different contexts in which Fisher makes use of them. Fisher was nonetheless keenly aware of the many disanalogies between evolution and thermodynamic processes. For a comprehensive assessment of the role of Fisher’s Fundamental Theorem in evolutionary theory, see Plutynski 2006.
See Israel 2004 and 1993; see also Hodge 1992, pp. 287-288.
Fisher 1930, p. 35; see Provine 1992, p. 149 and Skipper 2009, p. 311-312.
Plutynski 2006, in a recent reassessment of Fisher’s Fundamental Theorem, denies that such assumptions are central to Fisher’s research program, relying mostly on later writings.




Figure 1: “Field of gene combinations occupied by a population within the general field of possible combinations”. From Wright 1932, fig. 4, p. 166.

This qualitative feature in the representation of the ideal conditions for the evolution of Mendelian populations is actually the result of Wright’s life-long experimental practice in physiological genetics and inbreeding, as well as of the wider research tradition on geographic speciation, which had established the primary relevance of nonadaptive morphological characters differentiating related species. But, at a different level, a conspicuous number of Spencerian metaphors can also be detected in his way of representing evolutionary change, such as the idea of a moving equilibrium resulting from the three-phase balancing process, whereby evolution comes to be depicted as a journey (or a walk) across adaptive hills and valleys – although with the crucial difference, with regard to Spencer, that here chance plays a new, creative role as the main factor preventing populations from sticking at suboptimal peaks.10 Nonetheless, as far as the structure of the population is concerned, a background role might also have been played by Spencer’s view that social growth, as well as biological growth, which involves an increase not in numbers but in organizational complexity, “results only by combination of these smaller societies; which occurs without obliterating the divisions previously caused by separations” (Spencer 1877, pp. 226-228), a picture very similar to Wright’s view of genetic differentiation occurring in the subgroups and later spreading out through the migration and crossbreeding of the fitter genotypes. One of Wright’s main concerns in building his adaptive landscape was to emphasize, in an intuitive fashion, the major role played by epistatic interactions among genes – an idea as crucial to his theory as it was statistically immaterial in Fisher’s representation of large panmictic populations. Thus the direct inspiration for his model was a 1931 article by Haldane that included a diagrammatic representation of selection upon two epistatic genes.11

Provine 1986, Chap. 7, reconstructs the persistent controversies about this issue since Darwin. See Gayon 1998; Ruse 1996 and 2004; Steffes 2007.
10 The most macroscopic Spencerian influence is, however, the subsumption of the different factors involved in the balancing process into the two major Spencerian categories of factors endorsing homogeneity and those endorsing heterogeneity.



Figure 2: Illustrating a case of “Slow selection involving more than one gene”. From Haldane 1932, fig. 10, p. 107.

The result demonstrated the presence of two stable states of equilibrium separated by an unstable state. Wright’s invention was the addition of a fitness surface, which highlights the stable states as hills and the unstable ones as valleys. The relevance of epistatic interactions in making the adaptive landscape rugged has been further emphasized in Kauffman’s model, which outlines a range from the lowest degree of epistasis, resulting in the Fujiyama landscape – rather resembling Fisher’s model – up to the highest degree of epistasis, resulting in a wildly rugged and indented landscape.12
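The epistasis-ruggedness spectrum of Kauffman’s NK model can be reproduced in a few lines of code. The sketch below is illustrative only, not Kauffman’s original implementation, and all function names are my own: each of N binary loci contributes a random fitness value that depends on its own allele and on K neighbouring alleles, and counting the local optima of the resulting landscape shows how ruggedness grows with K.

```python
import itertools
import random


def nk_landscape(N, K, seed=0):
    """Build a random NK-style fitness function over binary genotypes.

    Each locus i contributes a value drawn once per distinct state of
    (locus i, its K circular neighbours); total fitness is the mean
    contribution. K = 0 gives an additive, single-peaked ("Fujiyama")
    landscape; larger K gives an increasingly rugged one.
    """
    rng = random.Random(seed)
    table = {}  # (locus, neighbourhood state) -> random contribution

    def fitness(genotype):
        total = 0.0
        for i in range(N):
            state = tuple(genotype[(i + j) % N] for j in range(K + 1))
            key = (i, state)
            if key not in table:
                table[key] = rng.random()  # drawn lazily, then fixed
            total += table[key]
        return total / N

    return fitness


def count_local_optima(N, K, seed=0):
    """Count genotypes strictly fitter than all N one-bit neighbours."""
    f = nk_landscape(N, K, seed)
    optima = 0
    for g in itertools.product((0, 1), repeat=N):
        fg = f(g)
        if all(fg > f(g[:i] + (1 - g[i],) + g[i + 1:]) for i in range(N)):
            optima += 1
    return optima


# K = 0: additive fitness, hence a single peak; K = N - 1: a rugged
# landscape with many local peaks.
print(count_local_optima(8, 0), count_local_optima(8, 7))
```

With N = 8 the exhaustive search is over only 256 genotypes, so the full landscape can be enumerated directly; the contrast between the two printed counts is the point Kauffman’s Fujiyama-versus-rugged figure makes visually.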

11 Haldane 1931, pp. 139-141. A much clearer account of the image and its underlying assumptions for the layman is to be found in the Appendix “Outline of the Mathematical Theory of Natural Selection” to Haldane 1932, pp. 106-108, in a paragraph entitled “Slow selection involving more than one gene.”
12 Figure 4, reproduced by courtesy of McGhee (see McGhee 2007, p. 20, Figure 2.12), is applied to morphological traits, not to gene frequencies as in the original “NK fitness landscapes” (Kauffman 1995). Kauffman, unlike Wright, provides his model with a multidimensional extension, employing computer simulations. McGhee 2007 establishes and explores the continuity between evolutionary landscapes and the latest developments in theoretical morphospace modelling.





Figure 3: “A hypothetical adaptive landscape”, representing adaptive morphologies, “portrayed as a three-dimensional grid at the top of the figure and a two-dimensional contour map at the bottom”. From McGhee 2007, fig. 1.1, p. 2. Courtesy of G.R. McGhee.



Figure 4: “Contrasting topologies of adaptive landscapes. The dashed line depicts a ‘Fujiyama’ landscape, with a single adaptive peak with very high adaptive value, versus the solid line depicting a ‘rugged’ landscape, with multiple peaks of varying height but all of much lower adaptive value than the Fujiyama peak”. Kauffman’s original 1995 model did not include any picture. From McGhee 2007, fig. 2.12, p. 20. Courtesy of G.R. McGhee.



There is, however, a trick about Wright’s adaptive landscape that led Provine to discard it as “meaningless in any precise sense.”13 While in Haldane’s diagram each dot represents a population, Wright switches the meaning of the dots on the surface to representing gene combinations, that is, single genotypes, which Provine has argued could never result in a continuous surface of any kind. Such blurriness in the definition of a crucial parameter on the part of an outstanding mathematical genius is extremely interesting, the more so as this genotypic version of the landscape has been the most fruitful, especially in its impact upon the founders of the Modern Synthesis, such as Dobzhansky and Simpson. It is difficult to resist the temptation to look for some hidden reason for such a macroscopic flaw and imagine that it might have been prompted by Wright’s urge to convey a qualitative feature in his model.14 My guess is that he was trying to fill visually the gap between the qualitative implications of his theory and the necessary mathematical simplification of its actual extension. This surmise could account for the fact that Wright received so favorably the apparent betrayals of his own model, such as Dobzhansky’s, who eventually made the dots entire species, or Simpson’s, who made them represent morphological traits (Simpson 1944, Dobzhansky 1951). The most general conclusion of Wright’s 1932 paper, “that evolution depends on a certain balance among its factors,” together with the further specification that such a balance is also to be found “at all levels of organization” (Wright 1932, p. 170), could suggest that he considered these wider applications of his model not as misunderstandings but as veritable amplifications, reaching, by means of a metaphorical escalation, further organizational levels in the biological hierarchy.15

Developmental resilience

If we look at the origin of Waddington’s model of the epigenetic landscape, we find a much longer gestation. I cannot illustrate here in detail what I consider to be the first paragraph of the visual history of Waddington’s epigenetic landscape, the images proposed by Needham in Order and Life of 1936. The alleged “Waddington’s cones” – alleged, because Waddington never actually drew them – that Needham introduces provide him with an entry into biomathematical modelling, mainly inspired by Lotka, to conceive of development, or “progressive restriction of potencies,” as a cascade of states moving from the unstable toward the more and more stable (Needham 1936, pp. 58-59). What is relevant in Needham’s evocation of Lotka’s diagrams is that in the three-dimensional visualization, equilibrium is represented as the bottom of a valley – the exact opposite of Wright’s quite progressivist peaks, and a foreshadowing of the fashion in which Waddington would visualize equilibria in his epigenetic landscape. But the resemblance of this three-dimensional image to Waddington’s slopes and valleys is actually deceptive, as it is obviously the reverse, uphill movement that the line of the valley is depicting here. That is why Needham, in appropriating this visual idea, first of all reverses the movement, from uphill – or from the homogeneous, unstable state – towards downhill, or the more differentiated steady state, as is to be seen in his volcano-like image of development.

13 Provine 1986, p. 310. He refers to his discussion of this early flaw with Wright, who eventually corrected it in later writings.
14 See Gayon 1998 for a similar suggestion.
15 Wright 1953; see Hodge 1992 and Steffes 2007.





Figure 5 A: “Map of integral curves for the Ross malaria equations” and model of surface corresponding to the former figure. From Lotka 1924, fig. 28, p. 149.


Silvia Caianiello



Figure 5 B: “Map of integral curves for the Ross malaria equations” and model of surface corresponding to the former figure. From Lotka 1924, fig. 29, p. 150.




Figure 5 C: An application of Lotka’s model to the case of development: “Qualitative three-dimensional model of embryonic determination. Illustrating the passage from unstable to stable equilibria”. From Needham 1936, fig. 11.


Waddington’s pioneering attempt at reunifying the divided fields of genetics and embryology has received increasing attention from evolution researchers, particularly in the recent discipline of EvoDevo.16 More specifically, in 1991, Gilbert dedicated a paper to the origin of the epigenetic landscape model, relating it mainly to the tradition of embryological diagrams of cell lineages in experimental embryology, which resulted in a branching track of differentiating cells. In 1939, in the diagram representing pigment formation in the eye of Drosophila, Waddington indeed created an interesting substitution: instead of cell divisions, the bifurcations represent the alternative action of genes and their products that determine the fate of the cells. So, the “branching system of lines symbolizes all the possible ways of development controlled by different genes” (Waddington 1939, p. 182). Gilbert’s general point is that the epigenetic landscape is but an extension of this original track system, and that by remaining faithful to this visual tradition, Waddington was emphasizing his intention of unifying the two fields.17 As a matter of fact, however, Waddington states rather clearly that if a new symbolic representation is needed, it is precisely because of the inadequacy of the branching track representation. In 1939, Waddington therefore introduces a “valley model” or geological model of gene reactions, of which he provides at the time a purely verbal description, although the analogy with the “probability surface” of Wright’s landscape (also called here a “valley model”) is already acknowledged, without further argumentation, in the framework of the long section dedicated to the work of Fisher, Haldane and Wright (Waddington 1939, p. 183 and 293 ff.). The question at stake is that although single genes “often define alternative courses along which the [developmental] reactions may go [...] the course of any developmental process is determined by many genes,” that is, “by the whole genotype.”18 This is at least the explicit reason why Waddington turned to the landscape representation in 1939 as well as in Organisers and Genes in 1940, always cursorily acknowledging the connection to Wright’s model. Moreover, it is not surprising that, exactly because of his commitment to the unification of genetics and embryology, he might have been willing to keep to the same representational horizon as mainstream mathematical evolutionary genetics. It was only in 1957, however, that an image was added to the text to visualize the embedding of the single branching track into a system constituting the “genotypic milieu.”19 Here, the visualization of the genetic interaction pervading the whole genotype possesses a genuine intuitive appeal:

These pegs represent the genes, and the tensions on the guy ropes the chemical forces which the genes exert. As the diagram indicates, the course and slope of any particular valley is affected by the chemical tendencies of many genes; and if any gene mutates, altering the tension in a certain set of guy ropes, the result will not depend on that gene alone, but on its interaction with all other guys. (Waddington 1957, p. 35)

16 See Alberch 1980; Hall 1992, 2004, 2008; Slack 2002; Gilbert 1991, 2000.
17 Gilbert 1991, pp. 141-142. Hall 1992 shows a similar kind of attitude towards Waddington’s endeavor, attributing it to the integration of embryology and genetics rather than of embryology and evolution. Their interpretation of Waddington’s role appears more wide-ranging in later writings, however.
18 Waddington 1939, pp. 182-183; in Waddington 1940, p. 83, he rejects even more sharply the image of single branching tracks as “rather a clumsy method of expression,” as “the course of each branch of the complex track is controlled [...] by the whole genotype or the greater part of it.”
19 Waddington 1939, p. 163 ff., with reference to Timofeeff-Ressovsky 1934.





Figure 6: “The complex system of interactions underlying the epigenetic landscape”. From Waddington, Conrad H. 1957, The Strategy of the Genes. London: George Allen & Unwin Ltd., fig. 5, p. 36.

My working hypothesis is that, in The Strategy of the Genes in 1957, the epigenetic landscape enhances its status as a visual device exactly because it serves a new function, that of a visual instrument no longer just for merging evolutionary and developmental theories, but for revising the former by means of the latter; the same stance – as Gilbert has shown – that led Dobzhansky to promote Schmalhausen’s formulation of the principle of stabilizing selection rather than Waddington’s.20

20 Gilbert 1994. But the actual extent of Waddington’s impact on the Modern Synthesis workers appears to be much broader and subtler, as, for instance, some of Mayr’s references to Waddington in Mayr 1963 indicate. (Whether such a link could be interpreted as the clue to a common organicism, sensu El-Hani and Emmeche 2000, is a further question.) According to Schubert 1985, Waddington had elaborated his criticism of the Modern Synthesis as early as 1940, and the delay in its formulation was mainly due to the war. The coincidence of these dates with the formalization of the epigenetic landscape seems to fit my scheme.


In this respect, it appears rather intriguing that only by assuming this new function does the epigenetic landscape really turn into a model, against the background of a new sophisticated epistemological stance concerning the function of models, which actually dismisses any foundationalist implication in favor of an exploratory one (Waddington 1957, p. 249). In fact, in 1939, Waddington did not include any visualization of the epigenetic landscape, and when he finally did, in the frontispiece of Organisers and Genes, it was not as a diagrammatic representation but as a pictorial one, provided by his friend the painter John Piper.



Figure 7: “Pictorial” epigenetic landscape. From Waddington, Conrad H., 1940, Organisers & Genes. Cambridge: Cambridge University Press, Frontispiece (drawing by John Piper).

The choice of Piper was somewhat redundant: commissioning the presentation of a symbolic landscape from a painter of real landscapes. In his later book Behind Appearances, dealing with the “relations between painting and natural science” in the antirealistic and indeterministic context of the “Third Science,” Waddington projected onto Piper a still deeper affinity, and the terms in which he described Piper’s visual poetics appear surprisingly similar to the system-dynamic language he employed for introducing his landscape.21

21 Waddington 1969; for the definition of “Third Science,” see p. 5 ff. Among the British “Geometricizers,” he confesses to a personal preference for Piper (p. 58), but it is the general characterization of the whole group’s poetics that resounds astonishingly with the kind of linguistic images he employs for his epigenetic landscape: “Putting it in the broadest way, they saw the external world in terms of equilibria which resolved tensions; and these tensions operated in a world of objects which were refractory, hard,



The overall conception deployed in 1969, emphasizing the creative and imaginative character of scientific activity, can retrospectively help clarify the theoretical approach to modelling already on display in The Strategy of the Genes. Models provide neither explanations nor proofs, but may “lead the re-orientation of experimental approaches,” and also help reformulate the “new problems” emerging from the “newer methods and techniques” (Waddington 1957, p. 249). And actually, in this same book, Waddington devotes much energy to testing the exploratory power of different models in the search for a visual representation of canalized development. In the chapter entitled “The cybernetics of development,” he first significantly evokes Galton’s model of organic stability (without including the original diagram), a kind of “mechanic” model for explaining the tendency of organic structures to re-establish themselves in their former position of equilibrium, as long as a precise “range” is not “overpassed.”22 Then he tests the possibility of building, after Ashby, a veritable cybernetic model for development, complete with a graphic device, particularly fit for describing the self-regulatory properties of developmental systems and their threshold mechanisms. But, in the end, Waddington dismisses both Galton’s mechano-statistical model and Ashby’s cybernetic one, and comes back to the epigenetic landscape (Waddington 1957, pp. 13-27). Why does Waddington return to the epigenetic landscape, in a section invoking cybernetics as the domain into which it is desirable to embed developmental biology?
I think that, of the several reasons that prompted him to stick to the epigenetic landscape, a primary one is connected to his criticism concerning the mathematical foundations of the synthetic theory, in which Wright appears to play a new, strategic role.23 This polemical intention is introduced by contrasting the “geometrical mode of expression” appropriate to the epigenetic landscape with the algebraic one. The general flaw of the “mathematical theory” of evolution is precisely its “algebraic formulation” of the selective value of a single allele “whatever its frequency in the population and whatever other genes it is combined with.” But in this context Wright, although “responsible for much of the theoretical work in [that] field,” is presented as the only researcher who took “into account the necessity to argue in terms of complete genotypes rather than individual alleles” (Waddington 1957, p. 84). Thus Wright’s landscape, with its topological, geometrical effort, was credited with challenging the atomistic notion of fitness, insofar as it emphasized “the importance of [...] different adaptive facies.” In Waddington’s view, fitness is a systemic property, “a quality of the organism as a whole” that cannot be accounted for by simply breaking it down “into a number of immediate components” (Waddington 1957, p. 110). But Wright, although focusing on the species rather than on the organismic level of analysis, had also emphasized a few years earlier that “natural selection” is actually an “abstraction of the complicated reciprocal process” linking the species and the world, situated “in a general trend through the progressive elaboration of the pattern of organization of both” (Wright 1948, p. 538). The major step that even Wright misses, according to Waddington, is not having envisaged the crucial relationship between “changes in gene frequencies” and “phenotypic characters of organisms” – a critique that will constitute the bulk of his ongoing argument against contemporary evolutionary theory.24

An interesting aspect of the shift in Waddington’s use of the epigenetic landscape in 1957 is that it entails a sort of double movement, which has the effect of drawing it nearer to Wright’s landscape. First, it reduces the distance between the professed quantitative scope of Wright’s model and that of his own, which he always presented as merely symbolic and qualitative: he dismisses Wright’s epistemic pretensions, downsizing his model to a mere “qualitative expression of the possibilities,” which has to be supported by “an examination of the actual phenomena in Nature” (Waddington 1957, pp. 84-85). But, at the same time, Waddington introduces the epigenetic landscape in a new, diagrammatic version, and employs it as a visual device for representing the behavior of different developmental/epigenetic systems, and especially the particular features of genetic assimilation, finally endowing it with all the properties of a model, that is, a framework as valuable as Wright’s landscape for depicting developmental – instead of evolutionary – possibilities.

dense, difficult to work, the contrary of ductile, flowing, emollient. [...] This sense of the real world as an area of interlocking energies, of a line as a path of minimum energy through orderly fields of force is of course an expression of some of the basic notions of modern science” (p. 55; italics mine). A later development in Waddington’s attitude to art is deployed in Tools for Thought (1977), where he lets his text be punctuated by creative and often humorous illustrations by the artist Yolanda Sonnabend (a fashion recently reappraised by Jablonka and Lamb 2005, with illustrations by Anna Zeligowski).
22 Galton 1889, Chap. III, p. 27, where his definition of “mechanic” stability is actually more in the sense of statistical mechanics than of machinery. See also Wilkie 1955.
23 In 1957, the evaluation of Wright’s work was more significant than in 1939, when Waddington (Waddington 1953) rather seemed to incline towards Fisher’s interpretation of the evolution of dominance. For Waddington’s criticism of population genetics in 1953, partly echoed by Mayr, see Provine 1986, p. 280.



Figure 8: Representation by means of epigenetic landscapes of “‘Organic selection’ (the Baldwin effect) and genetic assimilation”. From Waddington 1957, The Strategy of the Genes. London: George Allen & Unwin Ltd., fig. 30, p. 167.

24 Waddington 1957, p. 110. Waddington 1969, while committing himself to an even clearer appreciation of Wright’s peculiar position in evolutionary mathematical genetics, inasmuch as he dealt with the “multiple gene situation,” formulates the same criticism in a more straightforward fashion: Wright did conceive his fitness surface as a genotypic instead of a phenotypic space. See also Waddington 1975, p. 280: “In my opinion the conventional Neo-Darwinian theories of Haldane and Fisher (and to a lesser extent, Sewall Wright) are inadequate both because they leave out the importance of behaviour in influencing the nature of selective forces, and because they attach coefficients of selective value directly to genes, whereas really they belong primarily to phenotypes and only secondarily to genes.”


Silvia Caianiello

From a contemporary perspective, perhaps, a more general, positive continuity between these first evolutionary and developmental landscapes stands out beyond the macroscopic opposition of the processes they describe. Both define a space of possibilities for modelling the narrowing of possible trajectories of “moving equilibria” throughout the system. Whereas in Wright’s view the evolutionary possibilities of a population are constrained by its structure and size, Waddington, with his notion of “resistance to developmental change,” foreshadows the concept of “developmental constraints,” which acts at the same time as a “resistance to evolutionary change,”25 thereby further affecting the possible directions of evolution.

25 Waddington 1955; for Waddington’s role in the genesis of the notion of “developmental constraint,” see Maynard Smith et al. 1985, p. 60.

Adaptive versus Epigenetic Landscape. A Visual Chapter in the History of Evolution and Development

Bibliography

Alberch, Pere. 1980. “Ontogenesis and Morphological Diversification”. American Zoologist 20: 635-667.
de Chadarevian, Soraya, and Nick Hopwood. 2004. Models: The Third Dimension of Science. Stanford, California: Stanford University Press.
Dobzhansky, Theodosius. 1951. Genetics and the Origin of Species. 3rd ed. New York: Columbia University Press.
Dobzhansky, Theodosius. 1975. Reminiscences of Theodosius Grigorievich Dobzhansky. (Transcript of interviews conducted for the Oral History Research Office of Columbia University in 1962 and 1963). Glen Rock, NJ: Microfilming Corp. of America.
El-Hani, Charbel Niño, and Claus Emmeche. 2000. “On Some Theoretical Grounds for an Organism-centered Biology: Property Emergence, Supervenience, and Downward Causation”. Theory in Biosciences 119 (3-4): 234-275.

Fisher, Ronald Aylmer. 1999. The Genetical Theory of Natural Selection (1930). A complete variorum edition, ed. by J. H. Bennett. Oxford: Oxford University Press.
Galton, Francis. 1889. Natural Inheritance. London: Macmillan.
Gayon, Jean. 1998. Darwinism’s Struggle for Survival: Heredity and the Hypothesis of Natural Selection. Cambridge, UK and New York: Cambridge University Press.
Gilbert, Scott F. 1991. “Epigenetic Landscaping: Waddington’s Use of Cell Fate Bifurcation Diagrams”. Biology and Philosophy 6: 135-154.
Gilbert, Scott F. 1994. “Dobzhansky, Waddington and Schmalhausen: Embryology and the Modern Synthesis”. In: Allen, G. E. and M. B. Adams (eds.). The Evolution of Theodosius Dobzhansky. Princeton, NJ: Princeton University Press.
Gilbert, Scott F. 2000. “Diachronic Biology Meets Evo-Devo: C. H. Waddington’s Approach to Evolutionary Developmental Biology”. American Zoologist 40: 729-737.
Gould, Stephen Jay. 1983. “The Hardening of the Modern Synthesis”. In: Grene, Marjorie (ed.). Dimensions of Darwinism. Cambridge: Cambridge University Press.
Haldane, John Burdon Sanderson. 1931. “A Mathematical Theory of Natural Selection. Part VIII. Metastable Populations”. Proceedings of the Cambridge Philosophical Society 27: 137-142.
Haldane, John Burdon Sanderson. 1990 [1932]. The Causes of Evolution. Reprint. Princeton, NJ: Princeton University Press.
Hall, Brian K. 1992. “Waddington’s Legacy in Development and Evolution”. American Zoologist 32(1): 113-122.
Hall, Brian K. 2004. “In Search of Evolutionary Developmental Mechanisms: The 30-Year Gap Between 1944 and 1974”. Journal of Experimental Zoology (Mol Dev Evol) 302B: 5-18.
Hall, Brian K. 2008. “Waddington, Conrad Hal”. In: Dictionary of Scientific Biography. Vol. 25. Detroit: Charles Scribner’s Sons, pp. 201-207.

79

Silvia Caianiello

Hodge, Michael Jonathan S. 1992. “Biology and Philosophy (Including Ideology): A Study of Fisher and Wright”. In: Sarkar, Sahotra (ed.). The Founders of Evolutionary Genetics: A Centenary Reappraisal. Dordrecht and London: Kluwer Academic, pp. 231-293.
Israel, Giorgio. 1993. “The Emergence of Biomathematics and the Case of Population Dynamics: A Revival of Mechanical Reductionism and Darwinism”. Science in Context 6(2): 469-509.
Israel, Giorgio. 2004. “La matematizzazione della biologia e la biomatematica”. In: Petruccioli, S. (ed.). Storia della Scienza. Vol. VIII, chap. 37. Roma: Istituto della Enciclopedia Italiana, pp. 288-293.
Jablonka, Eva, and Marion J. Lamb. 2005. Evolution in Four Dimensions: Genetic, Epigenetic, Behavioral, and Symbolic Variation in the History of Life. Cambridge, Mass.: MIT Press.
Kauffman, Stuart. 1995. At Home in the Universe: The Search for the Laws of Self-Organization and Complexity. New York: Oxford University Press.
Keller, Evelyn Fox. 2000. “Models of and Models for: Theory and Practice in Contemporary Biology”. Philosophy of Science 67, Supplement. Proceedings of the 1998 Biennial Meetings of the Philosophy of Science Association, Part II, pp. 72-86.
Levins, Richard. 1966. “The Strategy of Model Building in Population Biology”. American Scientist 54: 421-431.
Lotka, Alfred J. 1924. Elements of Mathematical Biology. New York: Dover.
Mayr, Ernst. 1963. Animal Species and Evolution. Cambridge, Mass.: Belknap Press of Harvard University Press.
McGhee, George R. 2007a. The Geometry of Evolution: Adaptive Landscapes and Theoretical Morphospaces. Cambridge, UK: Cambridge University Press.
Morrison, Margaret. 1999. “Models as Autonomous Agents”. In: Morgan, Mary S. and Margaret Morrison (eds.). Models as Mediators: Perspectives on Natural and Social Science. Cambridge and New York: Cambridge University Press.
Needham, Joseph. 1936. Order and Life. Oxford: Clarendon Press.
Plutynski, Anya. 2006. “What was Fisher’s Fundamental Theorem of Natural Selection and what was it for?”. Studies in History and Philosophy of Biological and Biomedical Sciences 37(1): 59-82.
Provine, William B. 1986. Sewall Wright and Evolutionary Biology. Chicago and London: University of Chicago Press.
Provine, William B. 1992. The Origins of Theoretical Population Genetics. Chicago and London: University of Chicago Press.
Ruse, Michael. 1996. “Are Pictures Really Necessary? The Case of Sewall Wright’s Adaptive Landscapes”. In: Baigrie, Brian S. (ed.). Picturing Knowledge: Historical and Philosophical Problems Concerning the Use of Art in Science. Toronto and London: University of Toronto Press, pp. 303-337.
Ruse, Michael. 2004. “Adaptive Landscapes and Dynamic Equilibrium: The Spencerian Contribution to Twentieth-Century American Evolutionary Biology”. In: Lustig, A., R. J. Richards, and M. Ruse (eds.). Darwinian Heresies. Cambridge: Cambridge University Press, pp. 131-150.
Schubert, Glendon. 1985. “Epigenetic Evolutionary Theory: Waddington in Retrospect”. Journal of Social and Biological Structures 8: 233-253.
Simpson, George G. 1944. Tempo and Mode in Evolution. New York: Columbia University Press.
Sismondo, Sergio. 1999. “Models, Simulations, and their Objects”. Science in Context 12(2): 247-260.
Skipper, Robert A. Jr. 2004. “The Heuristic Role of Sewall Wright’s 1932 Adaptive Landscape Diagram”. Philosophy of Science 71(5). Proceedings of the 2002 Biennial Meeting of the Philosophy of Science Association, Part II: Symposia Papers, pp. 1176-1188.
Skipper, Robert A. Jr. 2009. “Revisiting the Fisher-Wright Controversy”. In: Cain, Joe and Michael Ruse (eds.). Descended from Darwin: Insights into the History of Evolutionary Studies, 1900-1970. Philadelphia, PA: American Philosophical Society Press, pp. 299-322.
Slack, Jonathan M. W. 2002. “Conrad Hal Waddington: The Last Renaissance Biologist”. Nature Reviews Genetics 3: 889-895.
Maynard Smith, John, R. Burian, S. Kauffman, P. Alberch, J. Campbell, B. Goodwin, R. Lande, D. Raup, and L. Wolpert. 1985. “Developmental Constraints and Evolution: A Perspective from the Mountain Lake Conference on Development and Evolution”. Quarterly Review of Biology 60(3): 265-287.
Spencer, Herbert. 1877. The Principles of Sociology. 2nd ed. Vol. I. London: Williams & Norgate.
Steffes, David M. 2007. “Panpsychic Organicism: Sewall Wright’s Philosophy for Understanding Complex Genetic Systems”. Journal of the History of Biology 40: 327-361.
Timofeeff-Ressovsky, Nikolaj W. 1934. “Über den Einfluss des genotypischen Milieus und der Aussenbedingungen auf die Realization des Genotyps”. Nachrichten der Gesellschaft der Wissenschaften zu Göttingen 163: 53-104.
Waddington, Conrad H. 1939. An Introduction to Modern Genetics. London: George Allen & Unwin Ltd.
Waddington, Conrad H. 1940. Organisers & Genes. Cambridge: Cambridge University Press.
Waddington, Conrad H. 1953. “Epigenetics and Evolution”. In: Brown, R. and J. F. Danielli (eds.). Evolution. Society for Experimental Biology Symposium 7. Cambridge: Cambridge University Press, pp. 186-199.
Waddington, Conrad H. 1955. “The Resistance to Evolutionary Change”. Nature 175(4445): 51-52.
Waddington, Conrad H. 1957. The Strategy of the Genes. London: George Allen & Unwin Ltd.
Waddington, Conrad H. 1969. Behind Appearance: A Study in the Relations between Painting and Natural Science in this Century. Edinburgh: Edinburgh University Press.
Waddington, Conrad H. 1975. The Evolution of an Evolutionist. Edinburgh: Edinburgh University Press.


Waddington, Conrad H. 1977. Tools for Thought. London: Cape.
Wilkie, James S. 1955. “Galton’s Contribution to the Theory of Evolution with Special Reference to His Use of Models and Metaphors”. Annals of Science 11(3): 194-205.
Wright, Sewall. 1931. “Evolution in Mendelian Populations”. Genetics 16: 97-159.
Wright, Sewall. 1932. “The Roles of Mutation, Inbreeding, Crossbreeding and Selection in Evolution”. Proceedings of the Sixth International Congress of Genetics 1: 356-366. Republished in: Wright 1986, pp. 161-171.
Wright, Sewall. 1948. “Evolution, Organic”. In: Encyclopaedia Britannica, 14th ed. revised, 8: 915-929. Republished in: Wright 1986, pp. 522-538.
Wright, Sewall. 1953. “Gene and Organism”. American Naturalist 87(832): 5-18.
Wright, Sewall. 1986. Evolution: Selected Papers. Chicago: University of Chicago Press.


Cellular Dimensions and Cell Dynamics, or how to Capture Time and Space in the Era of Electron Microscopy Ariane Dröscher



Figure 1: The first electron micrograph that showed a distinct Golgi apparatus and that was identified as such. From: Dalton, Albert J. and Marie D. Felix. “Cytologic and cytochemical characteristics of the Golgi substance of epithelial cells of the epididymis – in situ, in homogenates and after isolation”. American Journal of Anatomy 94, 1954: 171-207, plate 7. Reprinted with permission of John Wiley & Sons, Inc.

Introduction

Like all scientific images, electron micrographs have an ambivalent nature: they are at once natural images and mental models, samples of nature and (more or less) intentionally produced artefacts. As the former, they should be as realistic and as little theory-driven as possible; as the latter, they should be as representative and as abstract as possible. All types of scientific images move within the innumerable gradations between these two extremes. My contribution focuses on the role of one historical image that closed an era of cytological inquiry by opening a new one, even though it was itself superseded almost immediately. In this way I will show the advantages as well as the shortcomings of single research techniques – that is, the partial nature of the knowledge they produce – and the relationship between the (scientifically) visible world and the real world in the realm of microscopy.


Structure

The first reliable electron micrograph of the Golgi apparatus was published in 1954 by Albert Dalton and Marie Felix of the National Cancer Institute in Bethesda (Dalton and Felix 1954). Even before it was officially published, the image was presented by Brontë Gatenby at the Golgi Apparatus Symposium held by the Royal Microscopical Society (Gatenby 1955), because he considered it spectacular. Indeed, it came as a major surprise, contradicting nearly all theoretically derived expectations. Firstly, for many cytologists it was a surprise that the Golgi apparatus really existed. During the 1940s, Golgi apparatus research had experienced a serious crisis: most cell researchers believed that the traditional Golgi techniques had produced mere artefacts or, at least, did not consider the obtained micrographs sufficiently concrete to base their research on. Secondly, and even more astonishing, the organelle’s ultrastructure was radically different from nearly all conceptions from the era of light microscopy. Throughout its history, the Golgi apparatus had presented considerable trouble to microscopists and histochemists. It was not visible in living or fresh cells, with or without vital dyes; the only way to make it visible required heavy fixatives followed by silver or osmium impregnation. Even once these techniques had been mastered, it was nearly impossible to pin down this tiny structure: slight modifications of concentrations or reaction times yielded different images. Its morphological and topographical nature could be very similar in different tissues and very different in similar tissues, and even the same tissue of the same species treated with exactly the same method could produce a different image, owing to the different age or physiological state of the research object.
Its pleomorphy undermined any attempt to define (or at least homologize) the curious structure by criteria of morphology, topography, or function, and cytologists agreed to employ a biochemical criterion, calling “Golgi” all structures that responded in a similar way to silver or osmium impregnation. Early electron microscopy failed to show anything within the cell that could have been interpreted as the electronic homologue to the Golgi apparatus of light microscopy. The situation changed in 1952, when George Palade introduced veronal-acetate (pH 7.2) as a buffer for fixatives containing osmium (Palade 1952). Two years later Albert Dalton and Marie Felix published their micrograph of the Golgi apparatus, causing a major shift in the general conception of its physical nature:

1. Despite its very polymorphic image under the light microscope, the investigation of its ultrastructure revealed a basic structure that is common to all types of cells, tissues, and species.

2. It is neither a network nor a system of canals, even less is it made of dense bodies, rods, granules, or a special Golgi substance. Instead, electron microscopy revealed single dictyosomes, scattered throughout the cell and showing a triple physical constitution consisting of a complex pile of flattened sacs or cisternae, big vacuoles, and small vesicles.

3. Its main physical components are membranes.

Thus, whereas the traditional image corresponded to the morphological idea of form and function still dominant in the first half of the twentieth century, the electronic one furnished the visual data for the growing fields of cytochemistry and molecular biology (see also Bechtel 2006). The former was made of bodies called organelles, solid granules, networks, and canals; the latter was a complex system of compartments and membranes. The electron micrograph furnished the basis for a structural pattern that was for the first time shared by all Golgi researchers.


The following years saw a resumption of research activity (even if much less intensive than the research on mitochondria and ribosomes, which were believed to play special roles in cell energetics and protein synthesis). Dalton and Felix’s image was compared with those obtained in different cell types and with other cell structures to establish a rough general scheme of endocellular organization. Many efforts were made to improve the preparation technique and further sharpen the electron-microscope eye. As a result, Dalton and Felix’s image was soon superseded by other, more representative ones. The very large vacuoles in particular turned out to be too idiosyncratic a feature to be universally valid. Our present conception of the Golgi apparatus is again under debate. Electron micrographs have greatly contributed to establishing it as a distinct cell organelle. Still, this visual proof may in reality be an illusion caused by interpreting momentary microscopic images as permanent states. Today, the Golgi apparatus is generally conceived not as one single well-defined body but as consisting of at least three distinct parts: the cis-Golgi (connected to the endoplasmic reticulum), the trans-Golgi, and the Golgi vesicles. Some cytologists go even further and doubt whether the apparatus can really be called an independent cell organelle, or whether it has to be considered part of another organelle, the endoplasmic reticulum, or merely a fleeting aggregation of proteins and lipid membrane that constantly assembles and disassembles (Check 2002). If the latter conception gains more supporters, Dalton and Felix’s image will soon be judged an artefact. Thus, the electron image did not resolve all problems. Some old ones remained unsolved, and new ones arose, in particular the reliance on highly artificial images, the loss of space, and the loss of time.

The artefact problem

Art and microscopy have at least one aspect in common: the aim to make the invisible world visible. According to the Swiss artist Paul Klee, “Art does not reproduce the visible, it makes things visible.” But whereas no work of art has ever been blamed for not representing the real world, the production of distorted images is the great fear of every microscopist. Subjectivity is the liberty of the artist, objectivity the demand of the scientist, and truthfulness the obsession of the microscopist. The relationship between visibility and reality is not the exclusive domain of artists. Since antiquity, philosophers have worried about evident and intrinsic truths. For Aristotle, the senses were always truthful and, thus, direct observations must form the basis of all knowledge. The introduction of optical instruments represented a methodological and philosophical revolution, because telescopic and microscopic impressions often belied impressions made by the naked eye (Mazzolini 2002). Consequently, the microscopic world is not a simple extension of the world visible to the naked eye, but a new world, a step into a new dimension. The same held true for the introduction of electron microscopy (Rasmussen 1997). A further promising gateway had been opened by introducing to cytology the more or less elaborate staining techniques recently developed in chemistry. The treatment with dyes and other chemical reagents brought out surprising new information about cellular and subcellular constitution. The turn of the century was in fact characterized by a proliferation of discoveries of supposed cell organelles. Yet the application of sophisticated microscopic techniques represented yet another step towards an estranged world, difficult to decipher, ambiguous, and uncertain in its reliability. The fear grew ever greater that one’s scientific work was being erected on a framework of optical illusions.
Yet, the ongoing search for the intrinsic truths of


living matter depended on the production of artificial images. At least since the end of the nineteenth century, microscopic facts were artefacts. As had been the case in the era of light microscopy, the samples for electron microscopy required heavy chemical interventions, and thus produced artefacts. Indeed, nearly every introduction of a new method, or even modification of a known one, was followed by debates on the reliability of the visual facts it produced. The creation of new journals dedicated entirely to technical improvements, such as the Journal of Electron Microscopy Technique, shows the importance given to methodological questions. Yet even today the chemical or physical effects of the great majority of these techniques are not entirely understood. Likewise, the techniques applied to the Golgi have not yet lost their merely empirical character: what really happens during these reactions, and why, is still a mystery. Still, scientists did not (and do not) actually need such clarification. To continue their work, it is enough to obtain a sufficient number of repeatable images that correspond to those made with similar techniques or similar research objects and that make sufficient sense. This shows that in microscopy it is often useless to adopt criteria of falsification or objectivity. Any procedure produces a true picture and often also objective facts. These preparations can be repeated by anybody at any time and place; the main criterion for acceptance or non-acceptance is whether these (arte)facts have any significance. Thus the artefact problem continued to exist, but it was not considered a severe obstacle to further research with the electron microscope. Certainly there was a gap between real nature and its electronic image. One of the most evident examples is the nonexistence of neat black borders between the individual compartments.
However, in the late 1950s, other aspects of electron micrographs were revealed as obstacles to a real understanding of the cell and its life, namely the problems of space and time.

Space

Electron microscopy caused a further restriction of the investigator’s view. Using light microscopy, research had focused on the entire Golgi apparatus, or at least on those parts that had been stained by the various microscopic techniques. Electron microscopy instead – penetrating further into the matter – concentrated the view on only one or a few dictyosomes. Yet the Golgi can cover an extensive region of the cytoplasm, and the number of dictyosomes within one cell can range from about thirty in pea roots to several hundred in the maize root tip to more than 25,000 in the rhizoid apex of Chara foetida. Still another problem arose when it became clear that the functioning of the Golgi apparatus had a strong effect on its shape. The botanist Gordon Whaley, for example, directed a great part of the work of his research team – mainly Hilton H. Mollenhauer, James H. Leech, Joyce E. Kephart, and Marianne Dauwalder – toward developmental systems in which cytokinesis, or cell division, takes place. His most important contribution to Golgi apparatus research in those years was his observation of the involvement of the dictyosomes in cell-plate formation, that is, the successive construction of the membrane that will divide the two future daughter cells after mitosis (Whaley et al. 1959), a process also noted independently by Albert Frey-Wyssling. Since the 1920s there had also been strong evidence for the involvement of the Golgi apparatus in the process of cell secretion. Electron microscopy confirmed and reinforced this interpretation, revealing that the secretory vesicles arise directly from the edges of the cisternae (Whaley et al. 1960) and that the entire dictyosome undergoes considerable morphological modification with changes in cell activity: it continuously assembles and disassembles, it becomes shrunken or hypertrophied, and the volume and character of the vesicles change.






Figure 2: An early electron micrograph of a nerve cell fixed without buffer. From: Pease, Daniel C. and Richard F. Baker. “Electron microscopy of nervous tissue”. Anatomical Record 110, 1951: 505-529, plate 1. Reprinted with permission of John Wiley & Sons, Inc.

In this way, it became evident that the capricious response of the Golgi apparatus to staining techniques and its notorious topographical and morphological changeability were not a joke of Nature, meant to drive all-too-curious researchers into despair, but rather its fundamental physiological feature. At the end of the 1950s, this had repercussions on the general cell concept as well. During that time, several leading cytologists (Hodge et al. 1956; Palade 1956; Whaley et al. 1959) advanced the hypothesis that the interior of cells might largely consist of a general system of membranous profiles, extending from the margins of the cell to the nuclear envelope and providing the cytoplasm with a tremendous amount of internal surface. Moreover, the studies made at the University of Texas at Austin on the morphogenetic events during different stages of cell division revealed a marvelous synchrony in the activity and structural changes of all, or at least a great number, of the individual dictyosomes scattered throughout the cytoplasm. In 1965, right in the middle of the era of the Central Dogma and of information theory in molecular biology, Hilton Mollenhauer advanced the hypothesis of an alternative flow of information, coordinating the activities of the individual dictyosomes so that together they form a functional unit, the Golgi apparatus (Mollenhauer 1965). The imaging technologies for electron microscopy, however, rely on thin sections and are thus essentially two-dimensional; they cannot reveal the complete three-dimensional organization of the Golgi and the structures involved in traffic to, from, and through it (Ladinsky et al. 1999).
Intracellular traffic of vesicles; synchronous membrane flow between various endocellular structures; the phenomena occurring during the construction of new cell walls; and, above all, quantitative autoradiographic studies made in 1963 showing the pathway of marked particles passing from the endoplasmic reticulum, through the Golgi, through the cell lumen, and to the cell wall (Nadler 1963; Warshawsky et al. 1963; Leblond 1965) demonstrated the Golgi apparatus


to be not just a cell inclusion but part of an integrated functional system. Thus, it seemed impossible to investigate the Golgi apparatus without clarifying its specific way of functioning. And, to do so, it seemed indispensable to reconquer the space of the cell lumen and the dynamics of cell life, both of which had been definitively lost with early electron microscopy.



Figure 3: Mollenhauer and Morré’s three-dimensional model of the Golgi apparatus. From: Mollenhauer, Hilton H. and D. James Morré. “Golgi apparatus and plant secretion”. Annual Review of Plant Physiology 17, 1966: 27-47, fig. 1 on page 29. Reprinted with permission of Annual Review of Plant Physiology.

In 1966, thirteen years after the first electron micrograph of a dictyosome was made, Hilton Mollenhauer and James Morré of Purdue University tried to integrate the morphological data with the indirect data deduced from physiological necessities, and published a three-dimensional model of the Golgi apparatus. This model, though it presented many shortcomings, explicitly attempted to incorporate the by-then-already-classical two-dimensional image. At the same time, Mollenhauer and Morré’s efforts testify to the great difficulty of correlating morphological and physiological – not to mention biochemical and cytochemical – features, and of developing a single model that illustrates both form and function.

Time

Electron microscopy propagated an image of dictyosomes neither as local concentrations of a distinct biochemical substance nor as massive bodies, but as principally composed of membranes. In addition to furnishing a different picture of their physical constitution, the technique made clear that dictyosomes are not stable structures with fixed and neat contours. On the contrary, the membranes seemed to be in continuous movement. This raised new technical and imaging problems. Chemical fixation for electron microscopy takes seconds or even minutes to immobilize cellular processes, and different cellular components are fixed at different rates. When one is attempting to characterize a highly dynamic structure such as the Golgi, this limitation is a severe one. Thus the researcher must rely not only on visual data, but must also produce a mental scheme of all the physiological processes taking place simultaneously in the cell. Using traditional electron microscopy, it is impossible to observe life in action. Thus a general difficulty of the pictorial representation of the Golgi apparatus became evident: a morphological representation probably needs to match the (supposed) natural appearance as closely as possible, whereas illustrations of function are probably better served by more abstract, mental models. And indeed, though form and function are intimately connected in


Golgi apparatus research, in the following years, the representation of the two aspects was separated.

Conclusion

Despite all hopes and efforts, microscopy does not furnish self-evident facts. Visual information has to be interpreted and integrated as part of a virtual and dynamic picture of the living and active cell. The significance of a microscopic image is not deducible from the image alone. Electron microscopy produced images that profoundly changed Golgi apparatus research. Subsequently, however, it was precisely the electron micrographs that compelled cytologists to return to a higher visual level, to broaden their view and not focus exclusively on the single structure itself, but to develop a dynamic vision that included the ongoing processes, especially membrane traffic, of the whole cell. This also meant leaving behind the classical concept of organelles as cell inclusions. It is possible that the pictorial visualization of the Golgi apparatus was, in this respect, a forerunner of the general change in the visual conception of the cell. Still another interesting aspect is the ongoing discussion about the exact nature of the Golgi apparatus. What might have driven most Golgi researchers into despair may, in the end, be a positive feature: if the degree of difficulty in defining a research object is an indicator of the liveliness of an investigative enterprise, then Golgi apparatus research is still today one of the liveliest.

Discussion

Serpente: Why did you choose the picture of 1954 as the paradigmatic change for what is a fact and what is an artefact? You make a split before and after. Why that picture? They all look just fine; is it empirical? How much empiricism is behind the image, or is your own interpretation in favor of a particular image, and a non-particular article? Because I make a similar divide between optical microscopy and electron microscopy by asking: does the Golgi apparatus exist, yes or no?

Dröscher: I do not mean that electron micrographs were considered more reliable than optical ones. The point was that light microscopy produced so many different images – black dots, rods, networks and more – and, moreover, one could not exactly see what the real nature of these structures was: a substance or pleomorphic bodies or dynamic canals. Dalton and Felix’s was the very first electron micrograph that showed a definite structure. Dalton, like (some) other cytologists, was obviously convinced beforehand that it really existed, but this conviction seemed to be based more on an intuitive belief than on objective facts. This micrograph, for the first time, showed a Golgi apparatus with a clear and definite structure. In this sense it was immediately shown at the symposium of the Royal Microscopical Society as proof for all participants that it was not an artefact.

Burian: I carefully stay away from questions about function. Did they play any role in decisions about what to say and what not? I am truly ignorant here, I don’t know the story, what the history is here. But what do you think was the basic idea of function here?

Dröscher: According to Bechtel, once you can include a structure in a functional model, people are more convinced of the existence of this structure. I am wondering whether this interesting idea is valid for Golgi research, too, because the researchers continued not to know exactly what the apparatus is good for. There already were many hypotheses about its function in light microscopy, and many bright hypotheses. But, I think, in the very first years after 1954 the work on the Golgi apparatus was mainly visual or morphological; only gradually was it included in a functional picture.


Ariane Dröscher

Perini: Could you say a little bit more about the artefactual status of the micrograph? I think it is not clear what you are getting at when you describe it that way. Is it because of the kind of detection technique that it is, or is it given a visualization, but not through light, in the visible spectrum, or is it something about the technique?

Dröscher: Through all the years of light microscopy, the heavy metal impregnation used to make the Golgi apparatus visible was charged with producing only artefacts. In the era of electron microscopy, instead, osmium impregnation in particular was considered a perfect technique for producing reliable pictures of ultrastructure, because osmium deposits on the membranes without interfering with them. On the other hand, these pictures are still artefactual, because there is nothing like black lines inside the cell.


Cellular Dimensions and Cell Dynamics, or how to Capture Time and Space in the Era of Electron Microscopy

Bibliography

Bechtel, William. 2006. Discovering Cell Mechanisms: The Creation of Modern Cell Biology. Cambridge, UK: Cambridge University Press.
Check, Erika. 2002. “Will the real Golgi please stand up”. Nature 416: 780-781.
Dalton, Albert J. and Marie D. Felix. 1954. “Cytologic and cytochemical characteristics of the Golgi substance of epithelial cells of the epididymis – in situ, in homogenates and after isolation”. Am. J. Anat. 94: 171-207.
Gatenby, J. Brontë. 1955. “The Golgi apparatus”. J. Roy. Micr. Soc. 73: 134-161.
Hodge, A.J., J.D. McLean and F.V. Mercer. 1956. “A possible mechanism for the morphogenesis of lamellar systems in plant cells”. J. Biophys. Biochem. Cyt. 2: 597-608.
Ladinsky, Mark S., David N. Mastronarde, J. Richard McIntosh, Kathryn E. Howell, and L. Andrew Staehelin. 1999. “Golgi structure in three dimensions: functional insights from the normal rat kidney cell”. J. Cell Biol. 144: 1135-1149.
Leblond, Charles P. 1965. “The Time Dimension in Histology”. Am. J. Anat. 116: 1-28 (27).
Mazzolini, Renato G. 2002. “I nuovi mondi della microscopia”. In: Storia della scienza, vol. V: La rivoluzione scientifica. Roma: Istituto della Enciclopedia Italiana. pp. 630-640.
Mollenhauer, Hilton H. 1965. “Transition Forms of Golgi Apparatus Secretion Vesicles”. J. Ultrastruct. Res. 12: 439-446.
Mollenhauer, Hilton H. and D. James Morré. 1966. “Golgi apparatus and plant secretion”. Ann. Rev. Plant Physiol. 17: 27-47.
Nadler, N.J. 1963. “Calculations of the turnover times of proteins in each region of the acinar cells of the pancreas”. J. Cell Biol. 6: 24-28.
Palade, George E. 1952. “A study of fixation for electron microscopy”. J. Exp. Med. 95: 285-298.
Palade, George E. 1956. “The endoplasmic reticulum”. J. Biophys. Biochem. Cyt. 2, Suppl.: 85-97.
Pease, Daniel C. and Richard F. Baker. 1951. “Electron microscopy of nervous tissue”. Anat. Rec. 110: 505-529.
Rasmussen, Nicolas. 1997. Picture Control: The Electron Microscope and the Transformation of Biology in America, 1940-1960. Stanford: Stanford University Press.
Whaley, W. Gordon, Joyce E. Kephart, and Hilton H. Mollenhauer. 1959. “Developmental changes in the Golgi-apparatus of maize root cells”. Am. J. Bot. 46: 743-751.
Whaley, W. Gordon, Hilton H. Mollenhauer, and James H. Leech. 1960. “The ultrastructure of the meristematic cell”. Am. J. Bot. 47: 401-449.
Warshawsky, H., Charles P. Leblond and B. Droz. 1963. “Synthesis and migration of proteins in the cells of the exocrine pancreas as revealed by specific activity determination from radioautographs”. J. Cell Biol. 16: 1-23.


II Instrumentaria Tools Matter

Agnes Arber: The Mind and the Eye Maura C. Flannery

Agnes Arber (1879-1960) was a distinguished British plant morphologist of the first half of the 20th century. She is relevant at this meeting because of her additional talents as an artist, historian, and philosopher; she is one of the few people who could hold her own in all the discussions at this workshop. Educated at Cambridge University and the University of London, she wrote three major works (Arber 1920, 1925, 1934) and over sixty research articles on plant morphology, and she was only the third woman elected to the Royal Society, receiving that honor in 1946. Married to the Cambridge paleobotanist Newell Arber (1870-1918), she remained in Cambridge after his death, raising her daughter Muriel and receiving various fellowships, but never holding an official appointment.



Figure 1: Agnes Arber, in a 1911 photograph taken by E. A. Newell Arber. From Schmid 2001, fig. 4, p. 1109 (copyright Annals of Botany, Oxford University Press).

The very fact that Arber achieved scientific recognition despite the lack of an academic position speaks highly of her research and also speaks to the place of women in British science in the first half of the 20th century. While she was in school, and again after she obtained her doctorate – up to the time of her marriage – Arber worked in the private laboratory of another woman botanist, Ethel Sargant (1863-1918). This laboratory was in Sargant’s home, and there Arber took up the research on grasses and other monocotyledons that was to be her field of study. At Cambridge, Arber worked at the Balfour Laboratory, a facility for women researchers and science students, until it closed in 1927. Then her request for space in Cambridge’s Botany Department was turned down, and so, borrowing a microscope and microtome from Newnham College, which had run the Balfour, she set up a laboratory in a tiny room of her home, thus following in Sargant’s footsteps (Packer 1997).


History and Philosophy

Arber was also very interested in the history and philosophy of science. Her first book was a history of early printed herbals that has become a classic and is still in print today (Arber 1912). In his Royal Society memorial to Arber, H. Hamshaw Thomas writes that Arber developed an interest in herbals as a teenager, when her father brought home one which he had been asked to appraise, and that her fascination with Goethe’s ideas dates from the same time (Thomas 1960). In 1946, she published a translation of Goethe’s Attempt to Interpret the Metamorphosis of Plants with an extended introduction and commentary. Through the years, she wrote pieces on figures in botanical history such as Nehemiah Grew (Robertson 1906) and John Ray (Arber 1943) for Isis and other publications. Her status in the history of science community is suggested by the fact that Charles Singer dedicated A Short History of Biology to her (Singer 1931), and she had an article on Guy de la Brosse in the first volume of Isis (Arber 1913). Her last article was also published in Isis, posthumously (Arber 1960). Arber’s two major works in the philosophy of biology are The Natural Philosophy of Plant Form (Arber 1950), which she described as a metaphysical view of plant morphology, and The Mind and the Eye (Arber 1954), on the philosophy of biology. The latter is the most accessible of her books and provides an interesting introduction to her ideas.

Arber begins the book by outlining what she sees as the steps in biological inquiry. The first three are to find a question to explore, to investigate it, and to interpret the results of the observations or experiments involved. Next comes testing the validity of this interpretation, followed by communicating the work to the scientific community. This is a relatively standard rendition of scientific inquiry, but Arber then adds one more step: reflecting on the research and its relation to larger issues in science and even in philosophy. Arber sees this as something researchers might do toward the end of their careers, just as she had done in publishing The Natural Philosophy of Plant Form when she was over 70 years old. She considers philosophical reflection important work because only when the larger implications of research are understood can its real value be appreciated and the scientific endeavor truly enriched.

In Natural Philosophy Arber traces the history of ideas about plant form from the time of Aristotle. She gives particular attention to Goethe’s idea of the leaf as the basic form in plants, to which all other structures are related. She then argues for a different fundamental form: the leaf as a partial-shoot. After providing a defense of this concept, she ends with a philosophical interpretation of plant morphology. She argues for a special place for morphology as different from, but equal in importance to, more analytic modes of inquiry such as the experimental methods used in biochemistry and cell biology.

Art

Arber was also an artist, and this skill influenced her approach to research. She received early art training from her father, Henry Robertson, a landscape painter by profession. Botanically accurate watercolors of plants done in her teens attest to her skill, and she did almost all the drawings for her scientific papers and books; several of the latter have well over a hundred figures. Nothing leads to close observation better than drawing does. I know this from personal experience, because for the last few years I’ve been taking botanical illustration courses. I would go so far as to say that it is difficult to really see something without at least attempting to draw it. Having to commit lines to paper, having to choose just the right colors, having to create a three-dimensional form on a two-dimensional surface, requires not only manual skill but a great deal of thought and looking, over and over again, repeatedly testing what appears on the paper against the object, and in the process getting to know that object in a new and deeper way.


Biologists are the most visual of scientists, and in the heyday of botany, zoology, and cell biology it was common for biologists to document their observations in drawings. Arber obviously did this, and she went a step further by turning her observational drawings into finished illustrations for her publications. This was probably primarily because she couldn’t afford to hire a professional artist. Her husband died in 1918, when their daughter was only five years old, and in the first half of the 20th century women were not likely to get paid positions at Cambridge University. But she made a virtue of necessity. She writes of how she found the quiet time given over to manual laboratory tasks useful in mulling over research questions, and she considered drawing time in the same light. In The Mind and the Eye, she writes that there is so close an association between fingers and brain that handing over either the technical or the interpretive side of the research to another worker cannot but mean serious loss of integration (Arber 1954, p. 12). From my limited drawing experience, I can appreciate both the advantages of doing one’s own drawing and the huge time commitment in creating publishable illustrations. This is much different from just producing drawings that others then translate into finished works.



Figure 2: Illustration of corms from Arber’s Monocotyledons (Cambridge University Press, 1925), p. 46.

Arber also made the best use possible of the limited tools she had available – the borrowed microscope, and the microtome for slicing thin sections of plant material for microscopic viewing. William Derek Clayton argues that her research was bounded by the relatively crude equipment to which she had access (Clayton 1965), and there is validity to this observation. Her publications in morphology are filled with images of slices through plant structures to illustrate three-dimensional forms. She is documenting her conclusions with the evidence she has amassed, and since her work is so visual, her evidence is heavily visual as well. In some cases, the pages of illustration outnumber those of text.


In a brief article entitled “Ways of Studying Nature”, the artist Paul Klee describes not only looking at external structure, but looking within as well, both literally and figuratively, in ways that are reminiscent of Arber’s approach. He writes: “Man dissects the thing and visualizes its inside with the help of plane sections; the character of the object is built up according to the number and kind of sections that are needed” (Klee 1991, p. 8).

But Klee is also interested in another way of getting to the inner life of an organism. In the same essay he notes that the result of such observations and dissections can be the ability to draw inferences about the inner object from the optical exterior, and what is more, intuitive inferences (Klee 1991, p. 9). Klee was painting at the same time that Arber was drawing. I am not claiming that there is any connection between them, but rather that the words of an artist can be helpful in illuminating what is going on in the work of a biologist/artist like Arber and that we should attend to Arber’s own words on the subject because they give an insight into how the visual work of biology is done even to this day.

The Mind’s Eye

Arber writes that artistic expression offers a mode of translation of sense data into thought without subjecting them to the narrowing influence of an inadequate verbal framework; the verb to illustrate retains, in this sense, something of its ancient meaning, to illuminate (Arber 1954, p. 121). In her introduction to Goethe’s Metamorphosis, Arber writes of Goethe that we know his actual visual impressions were peculiarly intense and greatly influenced his mode of thought (Arber 1946, p. 84). I think the same can be said of Arber. She goes on: “This way of apprehending nature might be called thinking with the mind’s eye; it lies halfway between sensuous perceptions reached through bodily sight, and the abstract conceptions of the intellect” (Arber 1946, p. 84).

And in Natural Philosophy, she notes: “The morphologist has to aim at what the portrait-painter achieves when he adds intellectual insight to mastery of technique, as seen through, and expressed in, the external lineaments” (Arber 1950, p. 209).

She went on to argue that there is, indeed, a certain correlation between artistic power and morphological insight (Arber 1950, p. 210). Since Arber’s research was on plant form, there was no better way to know form deeply than by drawing it from close observation. Arber contended that the passage from sensory stimuli impressed upon the eye by a given form, to the imaginative realization of that form in the mind, is a bridge between the senses and the intellect (Arber 1950, p. 211). The imaginative realization of form in the mind is at the heart of her thinking, of her wedding of the objective and subjective. Arber’s art not only supported her scientific work, but her historical and philosophical work as well. In fact, all four of her interests were highly integrated. Her last book, The Manifold and the One (Arber 1957), was primarily a work of metaphysics, and in the introduction she writes that the relationship of the one and the many is a concept that had fascinated her since her youth. It makes sense that this idea would appeal to a botanist: plant species are great examples of variations on a theme. Many flowers have sepals, petals, stamens, and pistils, but these structures vary widely in number and form from species to species. And there is an aesthetic aspect to these variations on a theme, as in music, an attractive dance of balance between unity and diversity.


But as is suggested by the materials she preserved, now at the Hunt Institute for Botanical Documentation in Pittsburgh, Arber probably thought that it would be her contribution to the history of botany and botanical imagery that would be most lasting. After her death, her daughter sent her papers to the Hunt. There I found many wonderful treasures, but also learned how much is not there. Being an incurable romantic, I was thrilled to discover that she kept her notes from the lectures of Newell Arber, one of her professors at Cambridge and later her husband. There is also a series of letters documenting her almost 30-year correspondence with D’Arcy Thompson. (She began the exchange in 1917, pointing out several items in his book On Growth and Form with which she took issue.) But I was disappointed to find almost no notes from her morphological research. All that seems left of this work are the papers and books she published. There are also few notes that directly concern her works of philosophy. Yet her notes for the book on herbals are there, including the notes for the revised version she published in 1938 (Arber 1938). In the letter which accompanied Arber’s papers to the Hunt, Muriel wrote that a few years before she died her mother had destroyed many of her papers as being of no use. It’s probable that Arber’s scientific notes were among these. Yet there are several binders with notes on herbals and boxes of index cards cataloguing the primary and secondary literature she used for both editions of Herbals. There are even copies of the illustrations used in the revised edition. It seems obvious that Arber thought the only research which might be of interest and value to future generations of scholars was her work on botanical illustration. It was history and the visual that trumped all. I haven’t found anything she wrote to this effect, but the circumstantial evidence seems clear. In Herbals, her interests in art, botany, history, and philosophy come together in the most forceful way; here she manages to integrate her many interests most effectively. Since she continued to amass information on this theme her entire life, it was her greatest passion.

Molecular Form

I would like to disagree with Arber and argue that much of her writing has lasting value, and that her philosophical work is particularly relevant today to the field of macromolecular form. Evelyn Fox Keller notes that the new technologies available for imaging cells and macromolecules have brought about the revival of a long-standing tradition among students of biology, one in which seeing is considered both the most reliable source of knowledge and the indispensable basis of understanding (Keller 2002, p. 201). If this is the case, then Arber’s views have something to say to those who are working in what she termed the physico-chemical sciences, particularly with protein structure, where the diversity of proteins is reminiscent of the diversity in plant species and structures. The study of protein structure is moving ahead so fast on so many different fronts – predicting structure from amino acid sequences, developing new visualization technologies, and correlating structure with function – that there is little time for Arber’s sixth step in research. However, developing such a perspective could be useful. At the very end of Natural Philosophy, Arber argues for a synthesis of the physico-chemical and morphological perspectives, which can be reached through combining conceptual reason with thought that is visual and tactual (Arber 1950, p. 211). In other words, analysis has to be combined with sensory perception. The tactile doesn’t seem to apply at the molecular level, but as the biochemist Jacqueline Barton has said of the DNA model she keeps on her desk: it is not just there to explain her research to others; she needs to look at and touch it when thinking about her work (Amatniek 1986). Two proteins, like two plants, may have similar structures, and while there are computer programs that will do what Arber sees the mind as doing – finding points of similarity and underlying unity – there is still need for the mind to analyze the significance of such similarities.


A brief look at the history of protein imaging indicates why this is the case. The creation of a representation of a protein structure is a very different process today than it was when John Kendrew was working on the structure of myoglobin, or Max Perutz on hemoglobin. Then it was a matter of figuring out how to visually represent atomic coordinates. Martin Kemp writes of Irving Geis’s illustration of the myoglobin model that it is difficult to credit that it was not done by computer (Kemp 2000). Indeed, there is a case for saying that our current computer graphics are basically providing a series of powerful, dynamic, and convenient glosses on the graphic modes invented by Geis and his fellow pioneers of handmade representation (Kemp 2000, p. 119). Geis took visual information from the many layers of two-dimensional Lucite density maps that Kendrew and his colleagues had created from the data on atomic coordinates, and then, using another set of visual information, that on how models of simpler molecules had been constructed, he tried to present the data in a way that was visually comprehensible and gave a sense of the three-dimensionality of the molecule. What Geis had to do was very much what Arber was describing in reconstructing forms from observations on thin slices of plant tissue.

Today’s protein chemists also work with artists, but now it is with artists expert in computer graphics. In many cases, the chemists and the artists don’t collaborate directly; the chemists use the programs the graphic artists have created. The middle steps of going from data to image are now invisible. There are also computer programs to compare sequences, and others to compare protein shapes. The kind of mental analyses that Arber did in trying to find parallels between plant forms in different species, or between different structures in one plant, are also carried out for protein structures, but the work is more likely to be done by computer than by the human mind. This is due to the complexity of the forms: the regularity (symmetry, for example) that helps to make many plant forms understandable is missing from many protein forms. The move to computers is also the result of a need for great accuracy in the comparisons. In dealing with molecular shape, very small differences in form can make large differences in how a protein interacts with other molecules. These small differences must be quantified, something that a computer can do but that would be impossible for the human mind to do with any accuracy. Still, it seems important for protein chemists to be more aware of the kinds of mental processes computer programs are performing in their stead. Eugene Ferguson has argued that as computers became more ubiquitous in the field of engineering, engineers became less visually literate, less able to interpret complex diagrams (Ferguson 1992). The same argument can also be made for protein structural studies. The issues which Arber dealt with – parallels between forms, getting at essential similarities and differences – are relevant to the work of protein chemists, particularly in treating the dynamism of proteins. Just as Arber had to deal with developmental change, so protein chemists must deal with the shape change that occurs as a protein functions. They have to mentally process not only the different shapes of a protein but the relationship between them, much as Arber processed information on plant development. Again, the computer is there to help, but making sense of the changing images on a computer screen requires a great deal of visual processing in the brain.
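To give a concrete, if deliberately toy-scale, sense of the quantified comparison just described, the sketch below scores the similarity of two short sequences with classic Needleman-Wunsch dynamic programming. It stands in for no specific program mentioned in the text; the sequences and the scoring values are arbitrary choices for illustration.

```python
def align_score(a, b, match=1, mismatch=-1, gap=-1):
    """Optimal global-alignment score of sequences a and b (Needleman-Wunsch)."""
    rows, cols = len(a) + 1, len(b) + 1
    table = [[0] * cols for _ in range(rows)]
    # aligning against an empty sequence costs one gap per residue
    for i in range(1, rows):
        table[i][0] = i * gap
    for j in range(1, cols):
        table[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = table[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            table[i][j] = max(diag,
                              table[i - 1][j] + gap,   # gap in b
                              table[i][j - 1] + gap)   # gap in a
    return table[-1][-1]

identical = align_score("GATTACA", "GATTACA")  # 7: every position matches
related = align_score("GATTACA", "GATCACA")    # 5: six matches, one mismatch
```

The point of the example is the one Flannery makes: the program reduces "similarity" to a single number, and it is still up to the mind to decide what that number means.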

Present-Day Plant Science

At the 16th International Botanical Congress in 1999, there was a symposium on the relationship between Arber’s work and new explanatory models for vascular plant development; the papers were later published in the Annals of Botany (December 2001). As Bruce Kirchoff, one of the organizers of the symposium, notes, systematics and molecular biology are creating huge amounts of new data about plants, but these data are only as useful as the models used to explain them (Kirchoff 2001a). What is lacking is Arber’s sixth step: taking the long view and examining the philosophical underpinnings of this work, finding ways to see the unity and meaning behind this information. Kirchoff also argues that while present-day morphologists do not usually take Arber’s holistic approach, there is a greater shift to the use of visual information, as in her work (Kirchoff 2001b). This shift allows systematists to capture more information, including some of the context in which the character occurs (Kirchoff 2001a, p. 1203), thus indirectly leading to a more holistic viewpoint. Kirchoff argues for visual databases in botany to avoid the narrowing of information which occurs when visual data are translated into words. He sees this as in keeping with Arber’s drive for a better way to see what is already visible: to draw our attention to the interrelation among a number of phenomena, to help us to see the plant with fresh eyes, to speak about the results of this seeing, and to place those results in the context of botanical thought (Kirchoff 2001a, p. 1204). This may be Arber’s most important contribution to the future of biology: to focus our attention on the importance of the visual. Philip Ritterbush has said that biology is the most visual of the sciences, but unfortunately biologists don’t always behave as if this were the case (Ritterbush 1968). Their work is so involved with the visual that they fail to notice the complexities and difficulties of observation and representation. Arber did not shy away from these issues, and perhaps in our effort to deal with them more forthrightly, her work might be a good place to begin.



Figure 3: Illustration of Ruscus hypophyllum, butcher’s broom, from Arber’s Monocotyledons (Cambridge University Press, 1925), p. 143.

Discussion

Fiorentini: I think you made a very important point, the connection of mind and eye. I am wondering about what we call in art history the comparing glance (vergleichendes Sehen), which is part of the curriculum of an art historian: to learn to see things in comparison and to recognize them according to your interest and sorting them out. And I don’t know if this kind of comparing glance that you can’t train could be a specificity of the way of seeing of someone who is trained as an artist and then makes observations in the scientific field.

Flannery: I think so, and this brings up two things which I should have said. One is that at the time biologists were working comparatively, not just looking at a plant, but comparing one species with another. So I think there is something very relevant in what you are saying. In her last work, The Manifold and the One, Arber was doing something very different; it was a work of mysticism, of contemplation. However, I think it still has to do with the idea of unity, as in plant structure, of a relationship between different forms. Variation here is a theme, very much as in styles in art. Secondly, I was supposed to be saying something about how instruments are involved in Arber’s work. W. D. Clayton wrote in an introduction to a reprint of The Gramineae that her work was very much limited by her use of the microtome. The microtome cuts specimens into thin slices, and her illustrations are full of drawings of thin sections of plant material seen with light microscopy. All this work is still about comparison, in that she is looking at similarities and differences in structures among species and within species. My point is that, as far as Clayton’s comment is concerned, all researchers are constrained by instrumentation at some point.


Bibliography

Amatniek, J. C. 1986. “Card-carrying inorganic chemist explores a biological world”. The Chronicle of Higher Education, 5 February, p. 7.
Arber, Agnes R. 1912. Herbals: Their Origin and Evolution, a Chapter in the History of Botany, 1470-1670. Cambridge: Cambridge University Press.
Arber, Agnes R. 1913. “The botanical philosophy of Guy de la Brosse: A study in seventeenth-century thought”. Isis 1(3): 359-369.
Arber, Agnes R. 1920. Water Plants: A Study of Aquatic Angiosperms. Cambridge: Cambridge University Press.
Arber, Agnes R. 1925. Monocotyledons: A Morphological Study. Cambridge: Cambridge University Press.
Arber, Agnes R. 1934. The Gramineae: A Study of Cereal, Bamboo, and Grass. Cambridge: Cambridge University Press.
Arber, Agnes R. 1938. Herbals: Their Origin and Evolution, a Chapter in the History of Botany, 1470-1670 (new edition, rewritten and enlarged). Cambridge: Cambridge University Press.
Arber, Agnes R. 1943. “A seventeenth-century naturalist: John Ray”. Isis 34: 319-324.
Arber, Agnes R. 1946. “Goethe’s botany”. Chronica Botanica 10: 63-126.
Arber, Agnes R. 1950. The Natural Philosophy of Plant Form. Cambridge: Cambridge University Press.
Arber, Agnes R. 1954. The Mind and the Eye: A Study of the Biologist’s Standpoint. Cambridge: Cambridge University Press.
Arber, Agnes R. 1957. The Manifold and the One. London: J. Murray.
Arber, Agnes R. 1960. “Robert Sharrock (1630-1684): A precursor of Nehemiah Grew (1641-1712) and an exponent of ‘natural law’ in the plant world”. Isis 51(1): 3-8.
Clayton, William D. 1965. “Introduction to Arber’s ‘The Gramineae’”. In: Agnes Arber, The Gramineae: A Study of Cereal, Bamboo, and Grass (reprint of the 1934 edition), pp. ixxxii. New York: Cramer-Weinheim.
Ferguson, Eugene S. 1992. Engineering and the Mind’s Eye. Cambridge, MA: MIT Press.
Keller, Evelyn F. 2002. Making Sense of Life: Explaining Biological Development with Models, Metaphors, and Machines. Cambridge, MA: Harvard University Press.
Kemp, Martin. 2000. Visualizations. Berkeley, CA: University of California Press.
Kirchoff, Bruce K. 2001a. “Character description in phylogenetic analysis: Insights from Agnes Arber’s concept of the plant”. Annals of Botany 88: 1203-1214.
Kirchoff, Bruce K. 2001b. “From Agnes Arber to new explanatory models for vascular plant development”. Annals of Botany 88: 1103-1104.


Klee, Paul. 1991. “Ways of studying nature”. In: Guse, Ernst-Gerhard (ed.). Paul Klee: Dialogue with Nature. Munich: Prestel, pp. 8-9.
Packer, Kathryn. 1997. “A laboratory of one’s own: The life and works of Agnes Arber, F.R.S. (1879-1960)”. Notes and Records of the Royal Society of London 51: 87-104.
Ritterbush, Philip. 1968. “The biological muse”. Natural History 77(9): 26-31.
Robertson, Agnes. 1906. “Nehemiah Grew and the study of plant anatomy”. Science Progress 1: 150-158.
Schmid, Rudolf. 2001. “Agnes Arber, née Robertson (1879-1960): Fragments of her life, including her place in biology and women’s studies”. Annals of Botany 88: 1105-1128.
Singer, Charles. 1931. A Short History of Biology. Oxford, UK: Oxford University Press.
Thomas, H. Hamshaw. 1960. “Agnes Arber 1879-1960”. Biographical Memoirs of Fellows of the Royal Society 6: 1-11.


In Three Small Dimensions: X-Ray Microtomography as a Way of Seeing Brian D. Metscher

Another example of CT-scan technology will be put forward by Brian Metscher. He will examine direct three-dimensional imaging of small biological samples, which can now achieve resolution down to the size of individual cells. As the conceptual offspring of labor-intensive methods of reconstructing sectioned samples, micro-CT offers a solution to some technical problems of 3-D imaging of embryos and other small soft-tissue specimens. Ultimately, we will be able to register morphological and molecular information in a single volumetric image. But imaging is not seeing: imaging is technical and computational, while seeing is perceptual and cognitive. This technology for direct microscopic 3-D imaging makes small biological structural systems visible to us, but we must also learn how to visualize 3-D structures and relationships in our own minds, and how to incorporate them into our theories and models.





Figure 1: 3D reconstruction from serial sections, see http://www.3dhistech.com/en/node/mirax-3, accessed August 2009.

For further information see Metscher, Brian D. 2009. "MicroCT for Developmental Biology: A Versatile Tool for High-Contrast 3D Imaging at Histological Resolution". Developmental Dynamics 238: 632-640.




Figure 2: Wax plate model of embryonic human GI tract (from Thyng, F. W. 1914. The anatomy of a 17.8 mm. human embryo. American Journal of Anatomy 17(1): 31-112).

Discussion of Metscher's talk

Johns Schloegel: Could you talk a little bit more about how you went from the tomography images to the digital images? Is it you, or just your work? Could you, who knows developmental biology, tell me how a developmental biologist does that work? What is the "developmental" in it, or do you do that work with biochemistry and bring development and biochemistry together?

Metscher: Yes, no, yes, yes. So the way it works is: the computer stores a digital image here, and this, and this one, and this one, and it stores all of those images (points to the images). It stores each image of a million pixels, and then it does a certain computation. It boils down to a couple of Fourier transforms. I don't compute it, I don't do that, nor did I program it. So, I push the button, and something comes out. The danger of not understanding what it is doing is that you don't really know what a point means, just like using a statistics program: 'look, I have a number that has a p-value, it must be important' – but you have no idea what to put into it. But I don't have to do the calculation myself. As we were actually making the image I did all the work. Yes, this can be useful; that has nothing to do with developmental biology actually. I have some background in engineering.





Figure 3: A. Chick embryo, ca. 2 days, X-ray projection images; B. chick embryo, ca. 2 days, rendered X-ray volume image; C. chick embryo, ca. 2 days, rendered X-ray volume image and two example slices; copyright by Brian Metscher.

Dröscher: As you know, computers are becoming more and more important for seeing, but I'm always wondering how much the computer interferes with the outcome of the image. That's why I'd like to ask: how much do you think it is an objective image you're presenting, and how much is it a vector image?


Metscher: Yes, some of what you are seeing is an artefact, it's white noise. Of course, other issues are: what is a natural image, or, can you see with the microscope, what can you see through the glasses? It is pretty relative. I regard computer imaging – any digital image and microphotography – as a tool, as a substitute for my poor delivery of the drawing. And I have to think of the image that way, as if I had drawn it myself. And I think it is important for a scientist to remember, because I am responsible for it. It's easy for me to say "it's a computer image", it all looks very real, but I'm not safe. So, anything that I say about it is just as if I had made the drawing, just like something from the 1890s.

Müller: Maybe you could turn up the light and show the bryozoan again, because it is really a very neat construction, and we didn't really see it.

Metscher: This is actually a fish nose. I'll show the bryozoan in a second, and you'll see why it is easy to confuse them in the dark. The nostril has two half-drawings, and this rosette-lamella is the sensory part of the smelling, much like your sensations. Let me isolate that; it's a big fish, I cut a piece of it, and put it into the flashlight and show you the film. This is a bryozoan, which is a little freshwater animal. This is about half a millimeter. The spectrals are all crowded in the tool that holds it. Did I finish that last question?

Burian: I want to challenge you a little bit about that seeing. It took too short a time. You don't have any morphological interpretation. But when we see a normal image, it is an enormous process in our head, which we understand less than we understand that computer you push the button on. We see through all of that apparatus without knowing what is going on inside, and what you do is to adjust the image for density, or sharpness, so that you can fully see certain features. So you don't have any interpretation in the morphological sense of what you are seeing.
Metscher: Yes, that's it. I was a little bit glib about that. But the seeing is as your seeing, and even this image I adjusted and improved until it looked as I wanted it to look, as it is. Sure, I see what I want to see.

Burian: So, the foundational interpretation of the image is something we have to truly see, perhaps go wild for it.

Metscher: Right, you cannot use a camera – a camera, not a digital camera, but a real camera, you know, adjust lens and focus, look to the light and all that stuff – you can't do it without seeing. But once you have got a picture, you can find it. It's much more like that. But to make this image you don't have to go through all of that yourself.

[background voice]: Just a small comment, also to this point. Although there is no scientific interpretation happening, again these images are heavily produced, and have layers, and all of these levels add to the interpretations already made.

Metscher: That's why I like that tool. That is my way of drawing.

[background voice]: And my question was: can you actually get volumetric data, like surface mesh data, from these images to make physical objects?

Metscher: Yes, once it is digital work and data.

[background voice]: And are you doing experiments with that?

Metscher: Actually, not for producing physical objects, because I started with an object; I don't need to do that. But with things like measuring volumes and surface areas. For example, that convoluted nose started out much less convoluted. So I want to study how it starts from this and folds out into this, and this and this. And I wanted to do that when I can do the computation.

Müller: If I can add to this: there exists a computer-controlled machine that can, so to speak, materially print out every layer you see on the computer screen, so there is no problem going from the virtual sections back to a physical object.

[background voice]: And this process, that is stereolithography?

Müller: Yes, stereolithography and many other tools, you know, carving out, or exact plastic building up. So, there are many ways to actually get to physical objects. You could also enlarge them in size, and from this minute specimen you would have a huge sculpture.

Metscher: You can do everything you want. So, talk to me.

de Chadarevian: Can you tell us a little bit more how this imaging, if it is not 3D or computer science, relates to the research work you are doing in EvoDevo, or anything else? You know, somehow it looks like rather traditional work of microscopy.

Metscher: Yes, indeed it is.

de Chadarevian: How does this relate to experimental work?

Müller: I think that will be much better answered by the next talk. And it applies to the same question.

Metscher: In a way, it is a substitute for my old stereomicroscope where you just looked. Science starts with real looking at things, and scientists just want to play with the tools.

Sgaier: Can you also visualize gene expression data with that?

Metscher: Yes, do you have some?

Sgaier: For example, in situ embryos ...

Metscher: Eventually, I want to be able to image them directly. But the procedure isn't done by the machine. They are 3D images.

Sgaier: That was what I thought of. So, right now you can't do that?

Metscher: I can't do that.

Müller: The overlay ...

Sgaier: I didn't mean the overlay, but taking the stained embryo, having an image with the gene expression data and the 3D.

Metscher: Not with micro-CT, as yet, but it's surely easy with the technique James is going to present.


Optical Projection Tomography: Revealing the Visible James Sharpe

Instead of scaling down, James Sharpe opts for the reverse, i.e., to fit the whole into the picture. A common direction for technical developments in microscopy has been to focus on smaller and smaller details of biological systems: the first question often asked about a given technique is, what is its resolution? However, the increasing recognition that biological specimens should also be studied in a more holistic way – as a complete system – means that a new goal for imaging is becoming recognised. Rather than always trying to dissect out as many small components as possible, we now realise the need for a better overview of the whole system. Optical projection tomography (OPT) is one new technology suitable for whole-organ and whole-embryo imaging. Because it does not pursue the common goal of revealing invisibly small components of specimens, it could be described instead as performing the slightly ironic task of revealing the visible – providing new perspectives on specimens that are in fact easily visible to the naked eye.



Figure 1: OPT scan of a wildtype 10-day-old mouse embryo. The embryo was whole-mount stained for the expression patterns of two proteins using fluorescently labelled antibodies. In blue is the signal detected for antibodies against the HNF3b protein, and in green the signal for neurofilament, a protein that is expressed in nerve cells. The red signal represents non-specific autofluorescence from the tissue, which highlights the overall anatomy of the embryo. The same data are displayed in two forms: the upper figures are a virtual section through the reconstruction, and the lower figures are the 3D surface model of the same embryo. [For further information see Sharpe, James. 2003. "Optical projection tomography as a new tool for studying embryo anatomy". Journal of Anatomy 202: 175-181.] Copyright James Sharpe.


Discussion of Sharpe's talk

de Chadarevian: I have three questions. One is: you said you don't need words, but you did use some words, didn't you? So I'd like to push you a little bit more to explain why you are saying that images are satisfying? And this is just the first question.

Sharpe: The answer to that is simple: because I am talking. On the slides I just prefer the image. Normal science does not talk. There are just lists of words.

de Chadarevian: Ok, now the second question, and I do feel it comes back to my other question. It seems to me that there is a movement back to a sort of just looking through a microscope. Could you comment a little bit on that? There is still the molecular biology, so how does that come into that?

Sharpe: Yes, that is a very good point. I usually talk about this point when I've got time. What we are saying, in a way, is: we're using this to look at embryos, normal embryos, not even mutants. It is leading us to a new way, or many ways, of descriptive embryology. And we have found a number of things of molecular matter and reality that have never been described before. So what we are doing is use a highly digital tool and go back to a very traditional thing, which is to describe morphology, morphometrics, shapes and relations, stuff like that. And we published some of that stuff.

de Chadarevian: And what does it mean for the molecular side? Is it like two traditions going in parallel, the molecular mechanism and a new morphology?

Sharpe: In my view, it is quite simple: the molecular view has to a large extent eclipsed the morphological view for a long time. And right now we are going straight back into synthesizing the two things back together. Because it's absolutely clear you can't understand a complex morphogenetic process without thinking about 3D shapes and tissues, and things like that. Well, in my field there are many groups that are doing that.
I don't think there are many groups that are not interested in that. I do think this is it: in the next ten years the issue is going to be bringing the molecular approach and molecular perspective back into a morphogenetic perspective.

Thieffry: This is going on already.

Sharpe: This is what I am saying. This is going on. I don't think that is a question anymore.

Thieffry: Patrick Lemaire tried to make an input last time.

Sharpe: All the funding agencies now are clear about this. They want interdisciplinary research, which means not just molecular biology. It means you have to take molecular biology and add some imaging, computer modeling. Some of that is just modeling single-cell systems, which is not very special, you know. But a lot of it is complex developmental processing.

[See for instance Sharpe, James, Ulf Ahlgren, Paul Berry, Bill Hill, Allyson Ross, Jacob Hecksher-Sørensen, Richard Baldock and Duncan Davidson. 2002. "Optical Projection Tomography as a Tool for 3D Microscopy and Gene Expression Studies". Science 296 (5567): 541-545; Lee, Karen, Jerome Avondo, Harris Morrison, Lilian Blot, Margaret Stark, James Sharpe, Andrew Bangham and Enrico Coen. 2006. "Visualizing Plant Development and Gene Expression in Three Dimensions Using Optical Projection Tomography". The Plant Cell 18: 2145-2156.]

Burian: I want to come back to this last slide, and the desirability that the schematics should disappear. The scale of what we can do now, that sort of work, fills gaps that were amplified by the schematics. And once one dropped schematics out of it, I think, for an increase of developmental objectivity. These are the things that have happened, and that disturbs. It revises developmental objectivity quite considerably.

Sharpe: This is one example. Partly, we were talking yesterday about the difference between a series of statics and what actually happens in between the pictures. In fact, for this kind of movement I would suggest this [pointing]. Here you have a series of statics and the interesting stuff happens in between. What we are turning to now is a vector field – I mean, it is a 4-dimensional piece of data which relates to it and has an intrinsic time inside the data. So this is one example where the schematic thing is truly a fact. In simple cases the schematic issue is true as well. Increasingly, instead of drawing a schematic of the brain or whatever, we use a sort of smoothed computer model of the brain and put labels on that. You still fill in annotations, labels and arrows, but you don't draw the thing, because the real thing is better than a drawing.

Fiorentini: But if you are saying you try to do all that in order to go back to a descriptive morphology, is that all an effort to visualize, to switch off an interpretation, that means to switch yourself off as an active intervention?

Sharpe: No, I can just emphasize with this [pointing to a slide]. That is, the objective image is this picture, but you don't see anything, except what you want to see. Then you do see. But it is not objective; I am totally happy with that.

Fiorentini: That is problematic. We are coming from the group of Lorraine Daston, who wrote a big history of scientific objectivity. So, it is very difficult, I mean, probably you agree in your group upon what is objective, but I miss two kinds of orchestration.

Sharpe: I guess what I am trying to highlight is ...

Schadwinkel: It is entirely accurate.

Fiorentini: But that is not valid for the 19th century.

Sharpe: What I am trying to highlight is: this, in fact, is at bottom a number.
In a sense I do not imply that it is objective. And it remains an unknown image until you interact with it; it doesn't even do anything, it just becomes a number. That doesn't mean that it is objectively true. Until you are looking at it you don't even know what is there.

Fiorentini: That is very curious. But what use is it to make these two kinds of images if you say "ok, I have to interpret them"? I don't understand; I'm not accustomed to this one, but to the other. What is that good for, in terms of objectivity?

Sharpe: I didn't mention any example of what kind of things we found that are different from the literature. But they are quite surprising, for example, that in the human development of the pancreas there is a dorsal branch and a ventral branch and they are moving in a certain way. This was described tacitly a hundred years ago. Now we discovered that there is in fact a certain connection between the two that happens in a certain way. That is a very obvious cue when you know what to look for. And now we have found it, it is an advantage. As soon as you see it like this, you have a pre-model [pointing to slide]. You see something that doesn't look like it is supposed to be, that is a bit funny. Then you go back, you play around with it, let's scan another one, you get another one, and another one, and another one. Then you go back to the literature from a hundred years ago, and then the literature from 50 years ago, and there really is no such thing as sizes or shapes, a real topological kind of activity. I mean, the question is why it takes this technique to reveal that. But I think the reason is, mainly, it has to do with comprehensibility. When you do a scan like this you get everything, every piece of tissue. You render it, you haven't physically dissected it, so it hasn't fallen to pieces, and there is a whole host of reasons why. Essentially, you preserve a tissue better and you have the information about all of the tissue.

Schadwinkel: I think it is interesting, this objectivity – what you see is not always what is real. And does it happen that with all these data there would be a mistake in it once?

Sharpe: Absolutely.

Schadwinkel: And how can you recognize it?

Sharpe: The whole debate about objectivity and subjectivity, and how science works. It occurs to me, and my feeling is, that scientists don't believe a lot of data. [mutterings] No, you don't, because you can't. I mean, what do I believe from molecular biology of the last 50 years? I certainly believe that DNA exists. And what do I believe of the research of the last 10 years on how the genes regulate? Well, until it is around and can be observed by 500 people over the next 20 years, there is still a little question. And that is the same for this, and it is the same for all this kind of data. We know about certain artefacts, but there are protozoa we don't know absolutely about, and I would never bet a lot of money on 100% accuracy of any of these data from any lab in the last 20 years. And it doesn't matter. The important point is, science doesn't depend on the data being right; it depends on the gradual process of doing it, discussing and correcting these data over periods of decades.

Schadwinkel: But I think what you always have to keep in mind is – you know from microscopy that there are artefacts, also in chemistry. And when you have the technical thing you think it must be the truth because the computer works.

Sharpe: No, no – I am not sure that any scientist really believes any piece of data at that level. It is true there are occasionally historical debates, or scientists arguing against each other for years and years and years. But that is not the normal character of science; it is progressive.
The more people observe it, the more times it is replicated, the more true it will be.

Burian: Replicated? Where do you put it in your context?

Sharpe: There are still plenty of little artefacts in it. We have the next 10 years to discover them.

Müller: Just a comment for the discussion. Sometimes it appears that these images done by James' or Brian's technology are more beautiful, more high-tech renderings of 19th-century 3D constructions that we could have done anyway, so to speak. But there is a fundamental difference, and that is in the block here [pointing]. These are inherently quantitative methods and numbers, and, therefore, you can do all kinds of quantitative calculations that were not possible before. You could measure within a 3D model also, but it would have been terribly crude. Now, here you have numbers – what it means is that it depends on the X-rays and light and everything, but since they are there, and since we have all these stages, there is the possibility to interpolate between these stages and actually to get much closer to the process than was ever possible before. Therefore, there is something in this technology that is entirely new, that was not possible before the bioinformatics age.

de Chadarevian: I mean, there is so much "computer" in it. Now computers make it possible to do that. It is not that scientists didn't want to do it before; they probably would have loved to do it, but they couldn't. So, maybe, we should think about what is really new here, or what just is 'computer', instead of speaking of computer images. What does it really mean, 'computer images'? That is the issue – what is possible now. Also molecular biologists nowadays, when seeing the different molecules, use the same kind of images at the molecular level. So, I really think this would be interesting to talk about: what it really means, what you were talking about, the quantifying and the digital image. Let's put the technology in the center of our discussion of such images. Somehow, it seems to me the science is just moving along this, you know.

Sharpe: Instead of what?

de Chadarevian: Instead of speaking of objectivity. I mean, because we are discussing the images, and the role of the images. And then, maybe, how scientists speak about that and how they think where the objectivity is, and what the real thing is.

Sharpe: No, no – I don't think that scientists ever worry about objectivity, ever. I have never ever discussed it before this meeting [laughter]. That is my basic point: people don't believe 100% in their data. That doesn't matter for scientists.

de Chadarevian: What I really think is interesting here is that it looks like morphology, but it is completely different.

Sharpe: There are some important differences, but I don't think you can overplay the differences. I mean, essentially a computer manages a lot more and does the calculations. But it is still a lens, a camera, light, capturing those things and representing them. So, it is very much more powerful, it's much bigger in quantity, you get a lot more, it is more complete. You know, it is still basic observation. One thing that Brian mentioned is that it starts with a microscope. This machine was first built from a dissecting microscope, just a normal one with which I was looking at the things. I see it as a maximally powerful tool that allows us to do things we couldn't do before, but it doesn't change data in nature.

Serpente: Something I would like to pick up from our talks. For me there was this idea that there is no theory-laden interpretation of data to create these images. And I just wonder how much you take from former anatomical manuals in constructing these images, when you built the software for the machine. And that is somehow related to a question, a very naive question, but can you make an elephant out of that?

Sharpe: With a big scanner I think I can.

Serpente: But there is something that is guiding your perception and your view and what you want to construct.

Sharpe: The algorithm, the reconstructing algorithm, is the only thing that does not really have any subjectivity, because it doesn't do anything really; it is just a mathematical relationship between projection images and the volumes. There was a mathematical theory for that a long, long time ago. But the serious thing is that you as a scientist do everything. Even before you start, you have made some choices, for example, which antibody to use. The fact is, the algorithm basically is just a mathematical formula. That's all the algorithm is – a formula, a true formula. But to use it immediately involves decisions. So, there are 100 antibodies to use, and which one you have chosen affects my results, and they affect my conclusions. But I am perfectly aware of that.

Metscher: For any kind of mathematical transformation I had the projection image to check, in the solid, and the transition from that to that. The data analysis was done just by a computer algorithm, and it does it the same way every time. You can calibrate that by using a known target. And I make choices about that, of course. I did choose which membrane to dissect when I made my chicken slice. That is what I show from the data. After that you can mitigate the data, you prefer the high ones, or others. But that's just the picture.


Sharpe: To assess the status of this algorithm – you could just think of a little camera. Forget about the algorithm, just a piece of film, go to the lens and the specimen. That is about how much choice you do have.


Tracing the mtDNA Trail Mait Metspalu

Taking up the issue of 'resolution', Mait Metspalu will present the puzzling fact that inferences based on frequencies of low-resolution markers may be far from the truth. For two decades, DNA sequence variation of haploid systems (mtDNA and the Y chromosome) has been used to study human population history and phylogeography. The recombination-free, uniparental mode of inheritance allows one to reconstruct a true hierarchical genealogy that can give insights into population origins and movements. His genetic and anthropological case study focuses on the Munda speakers of India, who share a substantial part of their Y chromosome pool with Southeast (SE) Asians. On the other hand, no obvious trace of SE Asian ancestry has been found in their maternal gene pool (mtDNA). When studying the mtDNA traces, they had to completely overthrow the initial potential interpretation. Mait Metspalu's objective here is to illustrate how different resolutions of looking at DNA sequence variation among populations can produce alternative rather than more detailed interpretations.
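The resolution effect can be made concrete with a deliberately invented toy dataset (the haplogroup labels and frequencies below are hypothetical, purely for illustration, not Munda or SE Asian data): two populations can be indistinguishable when variation is scored at low resolution, yet share no lineages at all once sub-lineages are resolved.

```python
from collections import Counter

# Hypothetical samples, labelled by full (high-resolution) sub-lineage.
# The letter before the dot is the low-resolution haplogroup label.
pop_a = ["M.a1"] * 5 + ["N.b1"] * 5
pop_b = ["M.a2"] * 5 + ["N.b2"] * 5

def freqs(samples, resolution):
    """Lineage frequencies at 'low' (haplogroup) or 'high' (sub-lineage) resolution."""
    labels = [s.split(".")[0] if resolution == "low" else s for s in samples]
    return {k: v / len(labels) for k, v in Counter(labels).items()}

# At low resolution the two populations look identical ...
print(freqs(pop_a, "low"), freqs(pop_b, "low"))  # both {'M': 0.5, 'N': 0.5}
# ... at high resolution they share no lineage whatsoever.
print(set(freqs(pop_a, "high")) & set(freqs(pop_b, "high")))  # set()
```

A frequency comparison at haplogroup level would here suggest close kinship; the genealogy visible only at higher resolution says the opposite, which is the general point of the talk.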

Discussion of Metspalu's talk

de Chadarevian: It's not a question, but, I mean, these maps are interesting. What role do these maps play in your research? Is it really that you get new insights out of them, or are they just representing what you've put into them?

Metspalu: It will be both. Basically this is illustration, but on the other hand it represents, let's say, spatial frequency data. You have the red dots representing the populations from which the data came. So, imagine a table; that would be the correct way to represent frequency. But it would be too big, too complex, and people would not comprehend it. You need to put it on the map just to see, "aha, ok, we concentrate on Poland and start from there". Therefore, it is an illustration, but because part of the result we will get is a narrative, what you see is the summary of such a narrative.

Fiorentini: I have a question about the map, because I didn't really understand it. The distribution, that's clear, but what about the time layer in this map?

Metspalu: The time layer is better in this map [pointing to slide].

Fiorentini: Ok, you have to make different ones?

Metspalu: No, that depends on how you design it. Here you have the time layer, that is the expansion time of this lineage. But in terms of the narrative this image is better, because the scale is time, and the colors of the arrows sort of represent the time depths of the events that are depicted.



Tamm, Erika, Toomas Kivisild, Maere Reidla, Mait Metspalu, David Glenn Smith, Connie J. Mulligan, Claudio M. Bravi, Olga Rickards, Cristina Martinez-Labarga, Elsa K. Khusnutdinova, Sardana A. Fedorova, Maria V. Golubenko, Vadim A. Stepanov, Marina A. Gubina, Sergey I. Zhadanov, Ludmila P. Ossipova, Larisa Damba, Mikhail I. Voevoda, Jose E. Dipierri, Richard Villems, Ripan S. Malhi. 2007. "Beringian Standstill and Spread of Native American Founders". PLoS ONE 2(9): e829. doi:10.1371/journal.pone.0000829.





Figure 1: Branches encompassing Native Americans and their immediate Asian ancestral and sister lineages, represented by complete sequences, are shown in black with coalescence ages indicated and geographic location identified by colours. Lineages in brown correspond to the main haplogroups, found in Eurasia and Oceania, but absent in Native Americans (Tamm et al. 2007, fig. 1, p. 3).


de Chadarevian: But that's the problem: that you can represent time only, but you can't represent both?

Metspalu: Basically you can, but –

de Chadarevian: But on the next map – either you show the red or you show the green ...

Metspalu: Here, I always thought that the information density is too high to start with, putting these arrows over here or – just the readability of the –

Bruhn: Just something to add, because you have shown several models just to display information. There are different alternatives, and there were similar attempts in the 19th century, for example, to combine trees of all kinds, linguistic trees with evolutionary schemes, with color. And all that was done to relate the dependencies, or relationships, to spaces, or times. One kept diagrams that used color – and that is a particular point – to combine an object's structure, or some sort of relationship, to space or to time lines. And I find it very interesting in this respect, or as James did say, that sometimes a diagram, a visualization that uses color, can be more abstract, or more clear, than a linear diagram, which you did put at the end of your discussion to demonstrate what you really mean. So, sometimes I think that color is still underestimated. And you can say that we are talking here about 3D and 2D. But one could also say that color is a dimension, not in the mathematical or geometrical sense, but as another dimension. And one common thing is, in this respect, you showed different maps and they all have a different semantics, as far as I can see; there is no code like red always means the past or old, or blue always means density, or anything like that. I mean, every map develops its own semantics. And I find that very interesting to study as an object of study – and I'll do that.

Burian: I more or less want to amplify an aspect of the same comment. You said that the same information was already in the tables, but for pattern detection devices such as human eyes that is not true. I think the maps provide – and color is one of the devices there – a pattern detection device that we can use in some ways to develop hypotheses that can't be developed without it. That is an investigative tool that goes quite beyond what you suggested in the verbal description of what you do with the maps.

de Chadarevian: I'm just interested: are these research tools, or do you just draw them to show to us?

Metspalu: Basically this is an illustration; we just draw it to better visualize the pattern. As I said, from the table you don't see the pattern, you don't recognize it.

Fiorentini: That means your conclusions are from the tables, not from the map.

Metspalu: No, no – after doing this, you inevitably will draw the conclusion from the picture. Because you've seen it.

Johns Schloegel: You said, now we focus on Poland; you could see there was something interesting in Poland. So that was part of your research tool; it drove you to make a new choice and look to Poland.

Metspalu: Yes, when the dataset gets really big, initially it is really the tool: you do this map quickly, and see where it is. Then you can look further, let's say into the diversity and the diversity differences in different regions. And you see whether this is the same, or just like the example of the Austro-Asiatic speakers. As it is not the same, you have the frequency peak, but not the diversity peak.


Patterns of Perception

Embryos and Empathy: The Model Experiments of Wilhelm His Thomas Brandstetter

I would like to start with an episode from the life of the Swiss geologist Albert Heim as told by himself. When he was ten years old, his father took him for a walk in the mountains, and upon returning, young Albert immediately set to modelling what had impressed him so much:

“I myself got the idea to imitate the mountains in small scale, and I tried by sticking together pebbles: sugar made up for snow; I had never before seen a relief model. The joy of mountains was roused to passion.” (in Brockmann-Jerosch 1952, p. 138.)

For years afterward he delighted in constructing relief models, and finally he recognized that “one has to understand the structure and formation of a mountain to model it correctly.” Heim concluded his recollections by acknowledging that “aspiring after a relief model has led me to geology first of all, and the relief model has made me a geologist.” (in Brockmann-Jerosch 1952, p. 138.) This narrative by one of the most famous geologists of his time is phrased like a confession, a short tale revealing the hidden drive behind Heim’s scientific achievements: the naive, boyish urge to imitate what he had seen, to rebuild at a small scale what had filled him with wonder in nature. However, this ‘primordial scene’ not only sets the frame for a mimetic desire that would never leave Heim (he was famous for building brilliant relief models of Swiss mountain chains); it also exemplifies a problem at the very heart of the scientific practice of modelling. Heim contrasts two approaches to geology and embeds them in a tale of personal development: the naive, childish imitation of the relief and the mature, thorough knowledge of structure and formation. Child and scientist, surface and depth, appearance and cause: at first glance, this seems an easy dichotomy, signifying the divide between everyday perception and the construction of an epistemic object. However, the issue is much more complicated. First, the mimetic desire is not an obstacle but the trigger for becoming a scientist and for inquiring into the real causes of phenomena; second, Heim never gave up building relief models, and he stressed their value for the education of students and the lay public. It seems, therefore, that the dichotomy does not signify a clear divide between nonscientific activities and science; rather, it hints at a zone of indetermination within science itself.
Questions concerning the relationship between representing and experimenting emerge: What is an adequate representation? How can a model represent the world? In what way can a model be used to actually generate new knowledge? It is fascinating to observe how such issues seemed to haunt nineteenth- and early twentieth-century scientists trying to make use of models. Take, for example, the embryologist Wilhelm His. His is generally acknowledged to have introduced the physiological method into embryology, advocating a mechanist approach that strives to identify the exact causes of each developmental stage. The aim of Entwicklungsmechanik, as the new discipline came to be called, was to derive every form in a way “that each stage with all its characteristics can be understood as a necessary consequence of the preceding one” (His 1874, p. 2). To achieve this, he resorted to simple mechanical causes such as the irregular rate of tissue growth, which, according to His, results in the folding of the germinal disk and thereby in the development of the characteristic forms of the embryo. Since his early years at university, His had shown an interest in geology. He admired the relief models created by Albert Heim and recommended the method of Plattenmodellierung employed by the geologist as an apt method for modelling wax embryos. Several times he drew attention to the parallels between geology and embryology, stressing how embryology could improve its terminology by taking lessons from the other science because, as he stated in a letter to the marine biologist John Murray:

“The results of geological observations concur in many aspects with embryological ones; the displacements and ruptures of strata in extended masses follow the same laws in the formation of the earth’s crust as in the formation of our own body.” (His 1889, p. 496; see also His 1894, p. 3.)

For a general overview of the history of embryology, see Mocek 1998; Maienschein 1994. In several texts, Nick Hopwood has provided new insight into His’s work that forms the basis of my interpretation.

His therefore not only modelled the embryonic forms as static wax figures; he also conducted experiments on the formation of displacements and folds. Taking up the practice of model experimentation as introduced by geologists such as James Hall, Auguste Daubrée, and Alphonse Favre, His started a series of inquiries into the laws of folding. By bending bands of different materials such as tin, sheet brass, paper, gelatin, and others, he identified basic types of folds, and by compressing discs of rubber he achieved a rough simulation of processes taking place in the early stages of the germinal disk (His 1894, pp. 7-36). His aim was to demonstrate that purely mechanical causes, in particular the displacement of parts by the uneven rate of growth of the germinal disk, are responsible for the development of the complicated forms of the embryo. Like Heim, His wanted to shed light on the causes bringing about certain shapes, and like Heim he described the process of inquiry as a move from surface to depth:

“Departing from the observation of embryonic forms, I have been led [...] by an immediate intuition to inquire into the relationship between the forms and their causes; that had naturally prompted me to compare the forms observable in germinal disks and young embryos with similar ones which can be produced by bending elastic matter.” (His 1894, p. 49.)

This narrative shows the same structure as Heim’s. The process of scientific inquiry is depicted as a transition from the observation of the outer form (as mimicked by wax models) to the probing of the inner causes (as reproduced by the experiments on folding). The process could also follow the opposite direction. His not only experimented with folding bands; he also devised simple leather and lead models that combined both aspects. They were results of mechanical processes as well as representations of forms (His 1887, p. 383f). However, as he wished to render the forms in more detail, he turned to Adolf Ziegler, who had already begun to produce wax figures in the 1850s. Their co-operation resulted in the famous models that acquired an almost iconic status for the growing discipline of embryology (Hopwood 2004, 2002). While the experiments were designed to mimic morphogenetic processes, the plastic reconstructions visualized three-dimensional structures. However, Nick Hopwood has shown that in His’s methodology, both types of models, the static ceroplastic figures and the experimental simulation of processes, were intimately connected. They were “the ends of a spectrum of uses” (Hopwood 1999, p. 484). His had developed both methods in parallel, and he often used the wax models to argue for the validity of mechanical principles. For him, modelling was crucial for understanding mechanical principles, and he once stated that he was brought to a mechanistic understanding of development “not by abstract reasoning, but by the empirical results of my first steps in modelling” (His 1894, p. 2).

His had attended lectures in geology by Bernhard Studer at the University of Bern in 1850 (His 1965, p. 28). For references to Heim, see His 1885, p. 5; His 1887, p. 388; His 1894. These experiments resemble those of Auguste Daubrée as described in Daubrée 1880, p. 223. For an overview of geological modelling, see Oreskes 2007.



Figure 1: Chick development as the folding of a lead plate (ca. 1866-1867). From Hopwood 2002, p. 44. Copyright by University Archive of Basle.

Why was modelling so highly regarded by His? It is important to note that the value of models derived from producing them, not from mere contemplation. For His, the act of making a model was crucial. This practice can be understood as driven by a mimetic desire whose primary aim was the emergence of a form in and on a material substrate (Didi-Huberman 1999, p. 18). In the embryonic models, the form oscillated between being a figure that only looked like something (the ceroplastic model) and being a phenomenon that actually was something (the folds brought about by the same mechanical causes in bands of brass as in the germinal disk). The crucial point valid for both paths is that before the models could become representations, they were traces of a practice that preceded their function as symbols or instruments. Thereby, the models were at the same time technical objects, epistemic objects, and aesthetic objects. They were technical objects insofar as they were products of gestures and devices; they were epistemic objects insofar as they generated new knowledge; and they were aesthetic objects insofar as they exhibited the nascency of form. In the rest of my paper I will argue that the aesthetic dimension of His’s models has a distinct location in nineteenth century discourse: the so-called aesthetics of empathy, or Einfühlungsästhetik.

Empathy

Theories of empathy or Einfühlung were part of the “physiological idealism” of nineteenth-century psychology. In this tradition, perception was understood to be an active act of the subject, a result of physiological and mental processes of the observer, who projects his own sensations onto the world outside (Müller-Tamm 2005, p. 11). The basis for this theory was established by experimentalists such as Johannes Müller, Hermann von Helmholtz, and Wilhelm Wundt, but its potential was soon recognized by specialists in different fields including aesthetics, the philosophy of technology, and anthropology. Wilhelm His had learned of this approach through the writings of Hermann Lotze, whom he admired greatly (His 1965, p. 39). He not only devoured the early physiological work of Lotze that had paved the way for a causal and mechanistic approach to medicine, but also praised his chief philosophical work Mikrokosmus for its “wealth of ideas, sharp criticism, subtle psychological observation, and excellent style” (His 1870, p. 128). However, this direct reception of Lotze’s approach provides only a superficial hint at a much deeper intertwining of His’s practice of modelling and the theory of empathy. I want to argue that Einfühlung provides the epistemic framework for the peculiar approach to modelling taken by scientists such as His and Heim, and that an investigation of this notion helps us understand their rationale and their aims. For Lotze, as for theorists such as Friedrich Theodor Vischer and his son Robert Vischer, aesthetics was to be based solely on the receptive faculty of the subject. However, reception was no passive event but an active process of projecting oneself into the objects of the world outside: “My mental and sensual self transposes itself into the inside of the object and feels its peculiar form from there” (Vischer 1927, p. 48). This feat was achieved by a “motoric gaze,” a gaze that followed the shapes and movements of the object and thereby induced an agitation of the nerves, resulting in an “inner imitation” of the form perceived (Vischer 1927). The crucial point is that this imitation consisted of a reconstitution of the composition of the form, and that this reconstitution was understood to be a somatic act involving the whole of the body. Reception and production were intimately connected. As the nerves were already agitated by reception, and the inner imitation was accompanied by a corresponding twitch of muscles and limbs, the inscription of these movements onto a surface to produce an image or a sculpture was only a minimal step (Vischer 1927, p. 35).

I will use the German word Einfühlung because in the English “empathy” the ethical connotation seems far stronger; in the nineteenth century, Einfühlung primarily meant nothing more than a psychological or mental act.
When Robert Vischer described the process of inner imitation, the example he used was a landscape. His description evokes images of the process of relief-making as characterized by Albert Heim:

“The observer uses the sensations acquired from nature: to produce the image anew and to own it entirely. He builds and layers the hills and mountains, he spreads them with woods, he raises trees, branches them out and covers them with leaves, deposits meadows, expands the plains, [...] attunes it in form and colour to a harmonious new creation.” (Vischer 1927b, p. 63.)

Empathic perception and modelling shared a structural parallel, as both reproduced an object by reconstituting and shaping its form. Although Vischer stressed that man does not use geological forces when creating the landscape, he still acknowledged that empathy consists of “an imitation, albeit dark, of the acting forces of nature” (Vischer 1927b, p. 63). This kept the epistemic status of empathy in suspense. Proponents of this theory were by no means mystics; they always stressed that nature could only be explained by a mechanical approach (Perpeet 1966, p. 203ff). However, Goethe had already stressed that knowledge about the forces acting in nature was necessary for the aesthetic rendering of an object, thereby establishing a common ground for aesthetic and scientific ways of seeing; and his conviction, as expressed in Wilhelm Meisters Wanderjahre, proved influential for the practice of modelling in the mid-nineteenth century (Hopwood 2007). Therefore, a strict opposition between aesthetic and scientific approaches seems to miss the point when regarding modelling in the nineteenth century. This problem still concerned His, who defined form as “the spatial disposition of the parts of an entity that are perceivable for our senses (at first for the eye and then for the sense of touch)” (His 1874, p. 147). Form was foremost a category of perception and thereby, following contemporary psychology, a product of the projective function of the subject. This renders the relationship between form and formative causes problematic, as form is a product of the subject’s mind as well as of the forces acting in nature. These forces, however, cannot be deduced by perceiving form, which offers us almost no hints about the underlying processes; our instruments, too, can only visualize the modifications of form over time, not their causes (His 1874, p. 147). As I have already argued, empathy could provide a means to render this uncertainty between aesthetics and epistemology productive. His argued that embryologists must not rely solely on analytical approaches such as the cutting up of the embryo with the help of the microtome, which offers a plethora of detailed cross-sections but does not contribute to an understanding of the whole. He even denied the scientific value and probative force of thin sections, arguing instead for a reconstruction of the form as a three-dimensional object (His 1885, p. 3f; His 1889, p. 487). Such a plastic synthesis of the dissected entity could be achieved by fabricating wax models or conducting model experiments (His 1885, p. 7). In any case, His deemed it necessary to come to an understanding of the embryo as a whole. However, such an understanding could only be achieved by actually making and manipulating the model, that is, by the interactive process of producing the form. This was the reason for his rejection of the method of Gustav Born, who had suggested a much speedier way of reconstructing models from layers of wax plates whose outlines were copied from the microscopic image (Hopwood 1999, p. 487). For His, the value of a manual reproduction of the embryo’s form lay in the bodily interaction with the object. He talked about “bodily intuition” and how understanding can only be achieved by involving not just vision but also the tactile sense – just as described by the theoreticians of Einfühlung (His 1887, p. 382).

For His’s reception of Lotze, see also Mocek 1998, p. 116f. For a brief overview of Lotze’s philosophical position, see Orth 1986. For Einfühlungsästhetik in general, see Müller-Tamm 2005, pp. 214-248 and Perpeet 1966.
Even his phrasing alludes to theorems of empathy. He talks about thinking one’s own way into the form (His 1887, p. 386), and in an early text he even uses the reflexive grammar Robert Vischer identified as central to an aesthetics of empathy: “the growing cartilage produces its own perichondrium, the eye its own capsule and the blood vessel its coat” (His 1865, p. 27). With this, His of course wanted to show that proximate, mechanical causes are sufficient to explain the formation of organs; but it is interesting to note that Vischer uses the same grammatical form to characterize the empathic perception of nature. For him, the use of the reflexive indicated “the foisted subject, the intruded self of the observer” (Vischer 1927b, p. 64). Even His’s emphasis on the manual handling of the model and his defense of freehand modelling can be traced back to the doctrine of empathy, especially to the writings of Hermann Lotze. For Lotze, empathy was not only an aesthetic category but constitutive of any action that establishes a relationship between the subject and the world (Lotze 1905, p. 203). The medium for such an interaction was the hand, which allows the subject to project itself into the object. The hand feels and manipulates the object, thereby acquiring knowledge about its properties. During this process, we get the idea of “an inner lawful coherency which links the properties to certain conditions” (Lotze 1905, p. 203). Tactile manipulation enables us to overcome the dichotomy between the outer shape – the surface of the object – and its inner nature. Lotze stressed that the medium, the organ or tool, is forgotten when manipulating an object. This “beneficial delusion” causes our soul to be directly present at the tips of our fingers or the point of the needle and enables the subject to project itself into the entity it examines (Lotze 1905, p. 207). We can now see the phantasm that stood behind His’s practice of modelling: immediacy.
Imitation by producing models was to be a means to be inside the embryo and, as it were, to become the process of formation itself. The material analogue to this phantasm was wax, which His used not only for his static models but also for some of his model experiments (His 1894, p. 9). Since antiquity, wax had been “the incomparable material of organic resemblances,” as its plasticity rendered it perfect for the imitation of biological entities, creating the impression of actually becoming living matter itself (Didi-Huberman 2004, p. 66). His spent a lot of time learning to handle the soft, pliable material, and he was aware of the intimate closeness between hand and wax, a closeness that renders any strict mechanical objectivity impossible. He was skeptical of the easier and faster method of plate modelling advocated by Born, arguing that Born underestimated the difficulties of modelling and that even with this method the finishing of the model could only be done by hand: “as soon as one models with the soft material, the precut plates lose their relevance as authoritative originals, so one has to revert to the tools of freehand modelling anyway” (His 1885, p. 5). That is what it means to be inside an embryo: to grasp its form from the inside and thereby to reconcile the dichotomy between visible form and underlying causes, which in this case also meant reconciling the naive realism of the model’s apparent aspect with its function as an epistemic object within the specialized discourse of Entwicklungsmechanik. The fabrication of wax models as well as the mimetic experiments strove to imitate the embryo as a “figuring figure,” a figure that is about to appear but is not yet classifiable as either an aesthetic or an epistemic object. Mimetic fabrications such as these are neither true experiments, which produce phenomena, nor representations, which produce images; they are impure entities, which do not allow for an easy separation of artefact and fact.

I think that the notion of Anschauung, whose central role for His is also stressed by Hopwood, is nearer to the aesthetic than to the mathematical discussions of intuition versus formalism. In mathematics, intuition mainly carried an ontological claim about the existence of objects outside the pure formalism of mathematical language, while for His, intuition was a category of a physiological and psychological epistemology.
This might be the reason for the difficulties the history of science seems to have had with these objects; however, under the paradigm of computer simulation, with its strange status in between theory, experiment, and representation, a new look at mimetic practices might be rewarding.

Finis

I would like to close this paper with some general remarks. First, I think that the role of empathy in science, especially in the nineteenth- and early twentieth-century sciences, deserves closer investigation, and it seems that this is not restricted to the so-called morphological disciplines. It is interesting to see how empathy seemed to pop up at the very moment when the epistemological grounds of scientific methodology were shaken by a crisis that denied the possibility of a direct alignment between subject and object (Cassirer 1957, p. 89ff). Maxwell, himself an important proponent of the new movement that departed from ontological questions, favoring instead a sort of preliminary epistemology based on the method of analogy, once talked about a peculiar way of doing physics that he himself seems to have endorsed:

“Others, again, are not content unless they can project their whole physical energies into the scene which they conjure up. They learn at what rate the planets rush through space, and they experience a delightful feeling of exhilaration. They calculate the forces with which the heavenly bodies pull at one another, and they feel their own muscles straining with the effort.” (Maxwell 1890, p. 220; see Cat 2001.)

I take the notion of “figuring figure” from Didi-Huberman 2001, p. 99. Apart from the works of Nick Hopwood and Peter Galison, mimetic experiments have not roused much interest in the history of science, although around 1900 they had been conducted in such different disciplines as cosmology, meteorology, geology, biology, and the engineering sciences (Galison & Assmus 1989). Interestingly, historians of art have recently shown a profound interest in wax models (Panzanelli 2008).


Even using the buzzword “projection,” Maxwell seems to advocate a practice of calculation that was supposed to offer an intimate relationship between the subject and his object, a relationship that involved the whole of the scientist’s body and reconciled the observer with the world. Second, it seems that a mimetic or hyper-realist desire is a constitutive part of modelling. Even if what Georges Canguilhem once remarked is true, that “a bad model, in the history of science, is that which the imagination evaluates as a good one” (Canguilhem 1963, p. 517), mimesis returns at least in the form of small details in many three-dimensional models, from the skin-coated electric fish of Cavendish to the tortoise-shaped robots of Grey Walter. These hyper-realist elements set in motion a dialectic between the common perception and the epistemic construction of an object, a dialectic that is not easily subsumed under the progressive movement of polemical reasoning as described by Gaston Bachelard, who argues that a rupture with common perception is a necessary precondition for the process of science. The pragmatic approach that holds that models are valuable insofar as they work successfully inside a discipline seems to fall short of the unsettling dimensions models can have. As I have tried to argue, the very nature of the practice of modelling renders them technical objects, epistemic objects, and aesthetic objects in equal measure, and this indeterminateness leads us to question what scientific method is, or what practice qualifies as scientific in the first place.

Discussion

Perini: I wanted to pick up on the question you closed with – the distinction between a representation that produces an image and an experiment that produces phenomena. But I have the opposite reaction – that it is both, that it is a representation and a phenomenon, and that is what you get out of it. So, could you say a little bit more about what you think the problem is and why it is not those things?

Brandstetter: Yes, I think the difficult problem arises when His pushes the plates together and then acquires a specific form. If he pushes them together in a certain way, he acquires forms that look like embryo forms. When he starts, of course, he wants to arrive at a certain image, a representation of how the embryo takes shape, but he does it by using the same forces – mechanical forces, because he believed that only mechanical forces were responsible for the formation of the embryo – the same forces as nature has used in folding embryos. And I think this would be an experiment. One could probably also call them auto-graphical images or something like that.

Sgaier: I am still curious what His’s problem is – the act of modelling helping us to perceive the embryo, and your hint at the hands, in that sense? I am just curious about something you came up with at the end: What do you think His would make of the new currency, the computer methodology that we use, the OPT images that we saw earlier? Because to me that kind of technology really helps me perceive the embryo in a different way than I was able to before. And this whole idea of being inside the embryo – when I see, you know, those computer images, I can actually picture myself inside the embryo and feel the form better than before.

Brandstetter: I can’t speak for His; probably he would be against it, because he didn’t like automated pre-modelling, which at his time was a sort of projection method for producing models. I think it has to do with his conception of empathy, which involves the bodily movement of the hands. If you only have a gaze – as when you are looking at the computer screen – then for His this would probably not be the same thing, because there is no immersion of the whole body.


For these examples, see Rieger & Bühler 2006, pp. 209-220 and 265-278. Not to mention computer simulation, where a lot of time and money is put into the graphical rendering of the results.


Wall: How did this simulating work, how did he describe it? Because I somehow feel that in practice you have to have a certain image, either from a representation or a microscopic picture, that you start out with. You start your folding, and at a certain point you have to stop your modelling. I was wondering, thinking about these stages that he somehow fixed. This is leather; it has to be fixed at a certain point. Did he describe it? When did he know that he had found a form? Because the tables he did were normative stages. Did he define a certain set of appearances as a normative appearance? This seems to me different from the idea of empathy getting inside the form, because it is an individual form, and it is not necessarily the case that every individual is forming along that route. How did he describe these simulations?

Brandstetter: His already had an aim in his mind, from his microscopic image or whatever, so he tried to arrive at certain stages that were known, or were accepted as stages, to show that mechanical forces could produce such shapes. I am not quite sure whether this contradicts empathy. Actually, it’s only a way of perceiving. With empathy you could feel yourself into all kinds of things. There are very extensive descriptions in Vischer’s books – becoming a cloud moving over the landscape, and being a tree.

Metscher: In a simple way, [with] the folding of these embryos he is actually making a thought process that does produce that form. And in finding those forces – whether you do it, at best, with your own hands, or another material, or any other machine, or, in effect, when you do it graphically with a computer itself, it being the forces that make it do that – you have simulated a process, you have simulated a field of forces, and that does work like the kind of model that we are interested in doing now – modelling the mechanical forces that cause the neurulation and so on. He didn’t go far enough, so far as to simulate the materials, the tension and strength and all that, but he did actually simulate one of the mechanical forces that we now know are involved in epigenesis – a model of that process.

Anderson: Actually, I would like to add something. In the last chapter of Nicolas Rasmussen’s book Picture Control, he mentions having found material in which an electron microscopist talked about entering the micrographs, and landscapes.


Bibliography

Brockmann-Jerosch, Marie, Arnold Heim, and Helene Heim. 1952. Albert Heim. Leben und Forschung. Basel: Wepf & Co.

Canguilhem, Georges. 1963. “The Role of Analogies and Models in Biological Discovery”. In Alistair C. Crombie (ed.), Scientific Change. London: Heinemann, p. 507-520.

Cassirer, Ernst. 1957. Das Erkenntnisproblem in der Philosophie und Wissenschaft der neueren Zeit. Von Hegels Tod bis zur Gegenwart (1832-1932). Stuttgart: W. Kohlhammer.

Cat, Jordi. 2001. “On understanding: Maxwell on the methods of illustration and scientific metaphor”. Studies in History and Philosophy of Modern Physics 32B: 395-441.

Daubrée, Auguste. 1880. Synthetische Studien zur Experimentalgeologie. Braunschweig: Vieweg.

Didi-Huberman, Georges. 1999. Ähnlichkeit und Berührung. Archäologie, Anachronismus und Modernität des Abdrucks. Köln: DuMont.

Didi-Huberman, Georges. 2001. Phasmes. Essays über Erscheinungen. Köln: DuMont.

Didi-Huberman, Georges. 2004. “Wax Flesh, Vicious Circles”. In Monika von Düring (ed.), Encyclopaedia Anatomica. A Complete Collection of Anatomical Waxes. Köln u.a.: Taschen, p. 64-74.

Galison, Peter Louis and Alexi Assmus. 1989. “Artificial Clouds, Real Particles”. In David Gooding (ed.), The Uses of Experiment. Studies in the Natural Sciences. Cambridge: Cambridge University Press, p. 225-274.

His, Wilhelm. 1865. Die Häute und Höhlen des Körpers. Academisches Programm. Basel: Schweighauserische Universitäts-Buchdruckerei.

His, Wilhelm. 1870. “Rezension von Lotzes ‘Mikrokosmus’”. Archiv für Anthropologie 4: 127-128.

His, Wilhelm. 1874. Unsere Körperform und das physiologische Problem ihrer Entstehung. Briefe an einen befreundeten Naturforscher. Leipzig: Vogel.

His, Wilhelm. 1885. Anatomie menschlicher Embryonen. Band 3. Zur Geschichte der Organe. Leipzig: Vogel.

His, Wilhelm. 1887. “Über die Methode der plastischen Rekonstruktion und über deren Bedeutung für Anatomie u. Entwicklungsgeschichte”. Anatomischer Anzeiger 2: 382-394.

His, Wilhelm. 1889. “Ueber die Principien der thierischen Morphologie”. Naturwissenschaftliche Rundschau 4: 485-487 and 495-498.

His, Wilhelm. 1894. “Ueber mechanische Grundvorgänge thierischer Formbildung”. Archiv für Anatomie und Physiologie/Anatomische Abteilung: 1-80.

His, Wilhelm. 1965. “Lebenserinnerungen”. In Eugen Ludwig (ed.), Lebenserinnerungen und ausgewählte Schriften. Bern: Huber, p. 13-57.

Hopwood, Nick. 1999. “‘Giving Body’ to Embryos: Modeling, Mechanism, and the Microtome in Late Nineteenth-Century Anatomy”. Isis 90: 462-496.

Hopwood, Nick. 2002. Embryos in Wax. Cambridge: Whipple Museum of the History of Science.

131

Thomas Brandstetter

Hopwood, Nick. 2004. “Plastic Publishing in Embryology”. In Soraya de Chadarevian and Nick Hopwood (ed.) Models. The Third Dimension of Science. Stanford, Cal.: Stanford University Press. Hopwood, Nick. 2007. “Artist versus Anatomist, Models against Dissection: Paul Zeiller of Munich and the Revolution of 1848”. Medical History 51: 279-308. Lotze, Hermann. 1905. Mikrokosmus. Zweiter Band. Leipzig: Hirzel. Maienschein, Jane. 1994. “The Origins of Entwicklungsmechanik”. In Scott F. Gilbert (ed.) A Conceptual History of Modern Embryology. Baltimore: Johns Hopkins University Press, p. 43-61. Maxwell, James Clerk. 1890. “Adress to the Mathematical and Physical Sections of the British Association”. In The Scientific Papers of James Clerk Maxwell. Vol. II. Cambridge: Cambridge University Press, p. 215-229. Mocek, Reinhard. 1998. Die werdende Form. Eine Geschichte der kausalen Morphologie. Marburg/ Lahn: Basilisken Presse. Müller-Tamm, Jutta. 2005. Abstraktion als Einfühlung. Zur Denkfigur der Projektion in Psychophysiologie, Kulturtheorie, Ästhetik und Literatur der frühen Moderne. Freiburg: Rombach. Oreskes, Naomi. 2007. “From Scaling to Simulation: Changing Meanings and Ambitions of Models in Geology”. In Angela N. H. Creager, Elizabeth Lunbeck, M. Norton Wise (ed.) Science Without Laws. Model Systems, Cases, Exemplary Narratives. Durham/London: Duke University Press, p. 93-124. Orth, E. W. 1986. “Rudolf Hermann Lotze: Das Ganze unseres Welt- und Selbstverständnisses”. In: Speck, J. (ed.). Grundprobleme der grossen Philosophen. Philosophie der Neuzeit IV. Göttingen: Vandenhoeck & Ruprecht. Panzanelli, Roberta (Hrsg.). 2008. Ephemeral Bodies. Wax Sculpture and the Human Figure. Los Angeles: Getty Publications. Perpeet, Wilhelm. 1966. “Historisches und Systematisches zur Einfühlungsästhetik”. Zeitschrift für Ästhetik und allgemeine Kunstwissenschaft 11: 193-216. Rieger, Stefan und Benjamin Bühler. 2006. Vom Übertier. Ein Bestiarium des Wissens. Frankfurt/ Main: Suhrkamp. 
Vischer, Robert. 1927. “Der ästhetische Akt und die reine Form”. In Drei Schriften zum ästhetischen Formproblem. Halle a. S.: Niemeyer, p. 45-54. Vischer, Robert. 1927b. “Über ästhetische Naturbetrachtung”. In Drei Schriften zum ästhetischen Formproblem. Halle a. S.: Niemeyer, p. 55-76.

132

“Placing Oneself at an Adequate Point of View”. Santiago Ramón y Cajal’s Drawings and the Histological Look Erna Fiorentini

The protagonist of Santiago Ramón y Cajal’s (1852-1934) novel The Corrected Pessimist, Juan Fernandez, realized upon waking up one day that his eyes had suddenly been turned into microscopes. After several vicissitudes connected with his enhanced visual sensibility, he reflected: In the organic world the impression of ugliness and repugnance comes from our inopportune looks at its constitutive elements (cells, fibres, membranes, appendices, etc.). [Though] in all things there is something beautiful and attractive. It’s all a question of placing oneself at an adequate point of view. (Ramón y Cajal 2001, p. 151)

This passage is intriguing because – besides addressing the aesthetic aspects of perception – it summarizes Cajal’s fundamental attitude towards microscopic vision, namely that controlled seeing broadens in every sense the horizon of understanding. This attitude is the very foundation of Cajal’s ‘histological look’ and of the methods of visualization derived from it. Cajal’s method represents a crucial step in the history of microscopic observation and visualization. First of all in a historical sense, because as a visually oriented approach, it was very much at odds with the general disposition of the time when Cajal began what he termed his honeymoon with the microscope (Ramón y Cajal 1988, p. 252) as a young assistant in Saragossa in 1877. Cajal remembers having been: excessively surprised by the almost total absence of objective curiosity on the part of our professors, who spent their time talking to us at great length about healthy and diseased cells without making the slightest effort to become acquainted by sight with [them]… academic reactionists, never willing to muddle [their] mind by looking through the ocular of a magnifying instrument. (Ramón y Cajal 1988, p. 252)

Secondly, Cajal’s attitude towards visuality and representation is exceptional because of its methodological implications related to the linking of perception and objects. In Cajal’s conception, to regulate the view of microscopic objects, and thereby one’s understanding of them, necessarily involved optimizing the visibility and the visualization of their three-dimensional structures. Regulating the gaze, in fact, induced an enhanced perceptual ability to discriminate that, in turn, allowed a refined visualization of the observation’s results. To reach this optimum of visibility and visualization, Cajal adopted a unique method based on a highly elaborated multiplicity of procedures that corresponded to his manifold skills: his expertise in specimen preparation, his trained sharpness of visual judgment, and his artistic talent for visually presenting the results of his observations. The intriguing aspect of Cajal’s method was that this multiplicity functioned in both dimensions of visualization that are implicit in histology. First, in the specimen dimension, where the most urgent problem is making structures inside the three-dimensional object itself visually available for investigation; second, in the image dimension, where the histologist must transfer the perception and analysis of three-dimensional structures into images by means of a rather rigid two-dimensional system such as drawing or photography. To come to terms with these challenges, Cajal followed two main strategies that were tightly intertwined: selectivity and assemblage. The need for selectivity concerns first the very object the

histologist must investigate. Left in their natural state, in fact, histological specimens remain completely opaque, as Cajal noted: Structures of formidable complexity appear under the microscope with the colourlessness and the simplicity of architecture of a mass of jelly. The other natural sciences are more fortunate in that they work with objects of study which are directly accessible to the senses. Only histology and bacteriology are obliged to fulfil the preliminary and difficult task of making visible their special objects of study before they can commence the work of analysis. (Ramón y Cajal 1988, pp. 527-528)

Thus, even to make visible the structures in a thick section of, say, a human cerebral cortex, such as a pyramidal cell (Fig. 1), and to visualize their “precise arrangement and their relations with other, extracellular structures,” it was necessary to find a staining method that would be highly selective for the framework (Ramón y Cajal 1988, p. 520). To this end, Cajal optimized the silver-salt staining method introduced by Golgi, using “purely and simply hot, free nitrate of silver, capable of being precipitated by physical processes on the neurofibrillar skeleton” (Ramón y Cajal 1988, p. 523). With this method, to his “delight and surprise,” he was able: to make neurofibrils of almost all nerve cells besides numerous types of axonic terminal arborizations, [appear] splendidly impregnated with a brown, black and brick red colour and perfectly transparent. (Ramón y Cajal 1988, p. 523)

However, the ideal selective visualization Cajal obtained in his samples was more than a means to simply show what was hidden in the cells; for him, differential visualization was a means to refine the investigator’s perception and with it his capacity for judgment. Thus Cajal explicitly stated that, for the histologist, a “strictly differential [staining] technique was something like the acquisition of a new sense directed towards the unknown,” and that selective staining was not simply instrumental in making visible the complex cellular edifice, but also essential to improving the sensibility of the observer to “interesting and unexpected details of structure” (Ramón y Cajal 1988, pp. 526, 529). The step following the selective preparation of the samples, which was thought to broaden the perceptual spectrum of the observer, was to visualize these amplified observations via imaging. Here, Cajal applied the selectivity parameter as well. Instead of an overall reproduction of what he saw in a certain microscopic field, he first recorded very exactly in individual drawings the specifics of meaningful elements. This tight correspondence between drawing and object is evident in Fig. 1. But after compiling these specific individual depictions, the final image displaying the whole of a certain cell as a three-dimensional structure was not a one-to-one correspondence, but a composite made of a highly differentiated selection of these partial visualizations. The final images are the result of a complex procedure of visual assemblage (Fig. 2). This procedure was very much criticized by the scientific community. Indeed, many reproached the artificial character of these composed images. Cajal’s response to this criticism shows that the strategy of assemblage was part of a programmatic attitude. He argued: The only liberty taken was the artistic grouping of cells from various serial sections. It was unquestionably necessary to make use of this device. 
Otherwise, a very large number of figures would have been necessary, resulting in an essential loss of exact and clear representation. 

Cited in De Felipe & Jones 1992, 243-244, n. 41, 43 from a paper on the visual cortex of the cat (Santiago Ramón y Cajal, J. Psychol. Neurol. 29 (1922): 161-181, German translation of Santiago Ramón y Cajal,



Figure 1 A: Photomicrograph from one of Ramón y Cajal’s preparations of the postcentral gyrus of a newborn child, showing a layer V pyramidal cell impregnated by the Golgi method (copyright: Instituto de Neurobiología ‘Ramón y Cajal’, Madrid).
Figure 1 B: Drawing by Ramón y Cajal showing a Golgi-impregnated deep giant pyramidal cell of the postcentral gyrus of a child of 30 days (reproduced from De Felipe & Jones 1992, fig. 1).

Cajal called this strategy of composition “combining the images.” This shows that Cajal was not looking for a momentary, overall registration of the whole of an observation; rather, he aimed at optimizing the results of observation, namely by a critical extraction of visual data from multiple sessions of recording and by “uniting in a drawing the elements collected in several sections of the same region.” In doing so, he was able to create a generalized visual reconstruction of the overall organization of the three-dimensional system under investigation. Combining the images in order to improve their evidential power sublimated the content of the individual drawings; in fact, eventually, this imaging process showed not the actual structure, but the conclusions the microscopist drew from the multiplicity of the structure’s forms and from their mutual relations.





Arch. Neurobiol. 2 (1921): 338-362.)
Postscript of a letter sent by Ramón y Cajal to his pupil Fernando de Castro on 19 July 1927; see Sotelo 2003, fig. 7.
Ibid.




Figure 2: The organization of a folium of the cerebellar cortex. Drawing by Santiago Ramón y Cajal (copyright: Instituto de Neurobiología ‘Ramón y Cajal’, Madrid).

Moreover, this strategy of combination allowed Cajal to elaborate visual evidence sequentially and to rearrange the images of individual structures according to possible new insights about their organization in the overall environment. In this sense, Cajal’s combined drawings have been rightly considered “imperishable sketches [that he] was able to update on the basis of future studies” (Sotelo 2003, p. 74). Indeed, Cajal was convinced that the good drawing, and the good microscopic preparations are scientific documents that preserve their value indefinitely and whose revision will always be advantageous, whatever the interpretations they give rise to. (Ramón y Cajal 1899/1904, Preface to Vol I.)

Many of these imperishable sketches in fact underwent modifications over the years, depending on when Cajal reconsidered them. In describing these variations, scholars have spoken of the “styles” of Cajal’s drawings – using a notion of style that is very art-historical and that refers to the line movement and the treatment of masses inside the image (Pérez de Tudela Bueso 1987).




Figure 3: Drawings by Ramón y Cajal showing cells in the deeper layers of the visual cortex of the cat. A: published in 1899 (Cajal 1899, fig. 18); B: published in 1921 (Cajal 1921, fig. 11).

The difference between Cajal’s styles is manifest in Fig. 3. Both drawings represent the deeper layer of the visual cortex of a cat and come from two publications of Cajal’s: the first was a more general study of all sensorial spheres (the Flechsig centers) of humans and of several animals, including the cat (Ramón y Cajal 1899); the second resumed the discussion and the results of the older study and specifically investigated the visual system by means of the cat’s visual cortex, because of its similarity to the human one (Ramón y Cajal 1921). With that, Cajal wanted to define the conformation of the neurons and the behavior and the ending of the axons precisely, in order to discriminate between the fundamental and the secondary textures of the visual tissue and to draw conclusions about visual function in humans. If in 1899 Cajal represented the deeper layer of the visual cortex in a very stylized, linear, and flat way, in 1921 he improved the image by visualizing more elements, using several colors, changing shadows, and applying different levels of focus. Of course, an accomplished draughtsman like Cajal would have been able to visualize all of these structures in a more three-dimensional manner in the earlier image as well, for instance by shading with pencils of different degrees of hardness or using pen and ink. So it is likely that the linearity of the first image derives primarily from a focused view of a special structure, which was purposely brought to the fore in the drawing in this particular way. Conversely, the later sketches seem to represent something that Cajal had not seen, or more likely had not looked for, before; so the new sketch not only reinterprets, but also reshapes and enriches the first, and this difference is evidence of a changing view of the problem. As has been pointed out, one reason for this difference was undoubtedly the demands of publication (De Felipe & Jones 1992, p. 242).
In the earlier publication, the illustration technique was based on lithographs, for which, incidentally, Cajal made the photoengraving himself. Here


fine, clear lines were essential for conveying the minute structure of the arborizations, but this technique was inadequate to transmit depth. Conversely, when Cajal wrote the later paper, printing practices had improved, and by using the half-tone reproduction technique, he was able to adapt the original drawings, which were very complex, for printing. They had been made with a combination of pen and ink, pencil, as well as wash and brush work that allowed a better visualization of mass relationships (De Felipe & Jones 1992, p. 242). These examples show that drawing was indeed central to Cajal’s imaging practices. Nonetheless, in addition and as a complement, he applied an astonishingly eclectic spectrum of methods, ranging from unaided sketching, via the camera lucida, to the abdication of vision in favor of automatic techniques such as photomicrography. However, although he used all these options ad hoc for visualization, he was not fond of optical instrumentation for drawing. Cajal’s opinion was that [...] one must not build up hopes about the advantages of these apparati. The camera lucida, even when one is accustomed to its use by long practise, is only useful to fix the contour of the principal objects: any labour of detail must be done without the aid of that instrument, which has, in addition, the inconvenience of dazzling the delicate details. (Ramón y Cajal 1889, as cited in De Felipe & Jones 1992, p. 242)

Although Cajal had a long and advanced experience with photomicrography, he did not deem it an adequate method of visualization. [I do] not consider these photographs technically perfect at all, not even passable. In spite of the long experience that we have in this sort of work, it has not been possible for us to contravene the inexorable laws of optics. It is well known that when a section is thick and contains [cells] disposed in several planes and oriented in diverse ways, little is achieved by trying to focus on one group of cells; the images of those that are situated above or below and not in focus project shadows that disturb the purity and sharpness of the image. (Ramón y Cajal 1926, as cited in De Felipe & Jones 1992, p. 245)

It is no wonder, then, that Cajal always emphasized the superiority of reproduction by freehand drawing as the best procedure when one has some habit and liking for [artistic] painting. The first condition of the microscopist drawer [sic] is to know how to see and interpret what he sees; for that [reason] the artist and the microscopist cannot be separated. (Ramón y Cajal 1889, as cited in De Felipe & Jones 1992, p. 242)

Cajal’s drawings are highly sophisticated representations. However, they do not reproduce, but rather abstract three-dimensional vision. They provided him with a foundation for new considerations about the structure in question. His images do not really convey naturalistic three-dimensional impressions, but rather the basic structures that are necessary to understand spatial relationships among individual elements. Take for instance the example in Fig. 4. Here, what generates depth is the superimposition of two-dimensional representations of the axonal terminal fields. It is this superimposition, and not the three-dimensional effect of these structures, that allows the viewer to infer the close correlation between them. What Cajal did was actually to put the observed three-dimensionality into a reliable, nearly three-dimensional format, in the form of sophisticated two-dimensional drawings that should be able to transmit the information derived from the three-dimensional specimen.



For an excellent account of the juxtaposition of drawing and photography in Cajal’s visualization practices see De Rijcke 2008.






Figure 4: Drawing by Ramón y Cajal showing nerve cells from the human cerebellum (copyright: Instituto de Neurobiología ‘Ramón y Cajal’, Madrid).

So we can surely define Cajal’s drawings as graphical symbols that are able to convey the best information about the object after the primary visual information has been processed by selecting, distinguishing, and reassembling the structures observed. In Cajal’s notion, it was necessary to pursue the visualization of the structure in both the object and the image concomitantly. The staining of the specimen that made the actual structures visible depended on the features to appear in the drawing, and vice versa. This visual interplay did not necessarily result in any exactness of reproduction, but rather led to a selective visualization of spatial relationships via the judgment of the observer. Thus, Cajal’s principles of microscopic visualization organize the passage from the material to its image through the filter of knowledge. Of course, Cajal’s selective and synthetic methodical virtuosity was instrumental to the construction of images. But the conviction underlying Cajal’s approach also aimed at the assertion of a controlled visual curiosity that should be effective in creating – and improving – a mental image of the structure hidden in the cells. In Cajal’s logic, methods of visualization are not only means to improve the visibility of the object. Even more, they should enhance the ability and sensibility of the observer to distinguish the contents in the prepared object and, moreover, coordinate eye and hand to reach the best and most informative visual result. The investigator guiding his visual habit should thus start “with the attitude of a fascinated spectator” (Ramón y Cajal 1988, p. 252) to unveil the mysteries of the cells. In Cajal’s practices and convictions, the “adequate point of view” from which to do this was found between the insights in the real structure and the stepwise image construction as an optimization of the singular visual records. 
The final images, in their turn, served as a basic instrument to convey the reasoning about the observed structures as well as the conclusions drawn from them. Finally, it was not by chance that Cajal returned his attention as a histological investigator to the aesthetic discovery of “the elegant architecture shown by the cells and the layers” of the hippocampus and the fascia dentata, all structures, Cajal writes:


adorned by many features of pure beauty of the cerebellar cortex. Their pyramidal cells, like plants in a garden as it were, a series of hyacinths, are lined up in a hedge which describes graceful curves. (Ramón y Cajal 1988, p. 415)

In fact, his faith in the power of beauty belonged to Cajal’s construction of the “adequate point of view” for histological observation and visualization. This faith triggered his confidence in the visual, because the beauty of natural objects was to him “not only of the intellectual order” (Ramón y Cajal 1988, p. 414). Aesthetic sensibility during histological observation and judgment was critical for Cajal, an indispensable factor inducing scientific curiosity and generating knowledge. Indeed, he was convinced that however poor and incomplete may be the objective vision of the scientist, he will even be able to affirm that the illogical and anti-aesthetic elements in the scientific conception of a phenomenon necessarily imply error or misunderstanding in the idea of the investigator. (Ramón y Cajal 1988, p. 414)

So, Cajal surely adhered to the old maxim that “nature is the work of a divine artist,” whose unique features and beauty the scientist ought to discover and depict in order to disclose its secrets. To this end, however, through his practices of visualization, he also implemented Benedetto Croce’s idea that “art and science… coincide in one aspect, the aesthetic aspect, [and that] every scientific work is also a work of art” (Croce 1992, p. 27).

Discussion

Dröscher: Only a comment on the Golgi technique. It was an extraordinary technique for imaging, I think, for at least three reasons. One was that it is a black reaction, so we can see it by the contrast. Another was that it colors the whole nerve cell. And it colors only some of the cells of the brain; we do not know exactly why some cells are colored and some are not. If it colored all the cells in the tissue, you would not see anything. Apart from Golgi himself, Cajal was the first person to see the potential of this technique.

Schadwinkel: It is the first time that I have heard about that. Can you show the brain cortex again – yes, this one. Thank you. I found it amazing – in the right image – that he made the drawings with pencil and then ink, the old-fashioned way, as I also very often do. When you look at the shadows behind it, I think he took brush and ink [diluted] with water, and so he gets images with depth, something like 3D imaging. It’s quite amazing, with such a simple effect. He has the typical drawing technique of an artist – something is clear, something is far away. This is interesting in a scientific drawing.

Fiorentini: The constraints of the publication technique become evident in this example, on the left side. In the graphic four-colour technique you had to superimpose single lines four times. And so you needed crisp, clear lines in order to show all the structures. Later, in 1921, the technique had improved, and with the so-called half-tone technique you were able to reproduce the original drawing one-to-one in the publication – in the case of the image on the right side, unfortunately, not in color. You can make the various effects of the drawing visible at once; you don’t have to simulate more structures. So it is a question of the degree to which you are able to communicate the visual content of the same drawing through the printed form. It differs depending on the graphic technique you use.
Johns Schloegel: Interesting, and where is he writing about his visual graphics?

Fiorentini: In his publications, it is very difficult to find anything about his methods of graphic translation. It is significant that the description of the “combined images” was written on the


back of a letter he wrote to an acquaintance of his. But if you look at his publications, it is very rare that he talks about his graphic methods; he does not really explain why he is doing something in a certain way. He speaks about the advantages of drawing, not necessarily about how to publish drawings. It was more important to him to preserve the original material. His preparations are so perfect that it is still possible to observe certain structures in them. And he was convinced that he could use these preparations and the drawings he made of them in order to develop new ideas later on.


Bibliography

Croce, Benedetto. 1992. The Aesthetic as the Science of Expression and of the Linguistic in General (1902). Translated by Colin Lyas. Cambridge: Cambridge University Press.
De Felipe, Javier & Edward G. Jones. 1992. “Santiago Ramón y Cajal and methods in neurohistology.” Trends in Neurosciences 15.7: 237-246.
De Rijcke, Sarah. 2008. “Drawing into abstraction. Practices of observation and visualisation in the work of Santiago Ramón y Cajal.” Interdisciplinary Science Reviews 33(4): 287-311.
Pérez de Tudela Bueso, Maria Angustia. 1987. El grafismo, base fundamental para el científico: D. Santiago Ramón y Cajal. Tesina de convalidación. Madrid: Facultad de Bellas Artes de Madrid.
Ramón y Cajal, Santiago. 1889. Manual de Histología Normal y de Técnica Micrográfica. Valencia: Aguilar.
Ramón y Cajal, Santiago. 1899. “Estudios sobre la corteza cerebral humana.” Revista trimestral micrográfica 4: 1-63.
Ramón y Cajal, Santiago. 1899/1904. Textura del Sistema Nervioso del Hombre y de los Vertebrados. Madrid: Moya.
Ramón y Cajal, Santiago. 1921. “Textura de la corteza visual del gato.” Archivos de Neurobiología II, 4: 338-362.
Ramón y Cajal, Santiago. 1922. “Studien über die Sehrinde der Katze.” Journal für Psychologie und Neurologie 29, 1-3: 161-181.
Ramón y Cajal, Santiago. 1926. “Démonstration photographique de quelques phénomènes de la régénération des nerfs.” Travaux du Laboratoire de Recherches Biologiques de l’Université de Madrid 24: 191-213.
Ramón y Cajal, Santiago. 1988. Recollections of My Life. New York: Garland.
Ramón y Cajal, Santiago. 2001. “The Corrected Pessimist.” In Vacation Stories: Five Science Fiction Tales. Urbana: University of Illinois Press, 122-168.
Sotelo, Constantino. 2003. “Viewing the brain through the master hand of Ramón y Cajal.” Nature Reviews Neuroscience 4: 71-77.


The Cell as a Technical Image Matthias Bruhn

Complicated Vision

This paper focuses on an episode from the history of botany around the year 1805, a phase that preceded the formulation of cell theories in the 1830s. At this time, the existence and universal character of cells was still in question, and the cell had not yet been defined as an isolated and self-contained structure (cf. Dröscher 2008). However, due to the intensified discussion based on microscopic observation and the ongoing dispute between microscopists and those who doubted microscopic explorations, cells had become one of the prominent issues in physiological research by the late eighteenth century. It is commonly understood that the term “cell” was coined long before the cellular structure of bodies was accepted. Robert Hooke had observed antra or boxes in a small piece of cork, observations he engraved in his Micrographia of 1665; the boxes reminded him of chambers or honeycombs, and he assumed he was seeing vessels for the transport of plant sap. Hooke compared his observations to textile patterns, which he also reproduced among his illustrations. His metaphors of the architectural “cell” and the textile quality of “tissues” have survived, even though they have taken on different meanings concerning the object in question (Müller-Strahl 2004, p. 109). Hooke’s contemporaries Marcello Malpighi and Nehemiah Grew observed different sorts of channels, particles, and vessels, which Malpighi called utriculi (hoses, tubes) in his Anatome plantarum (London 1675/1679). Grew proposed that some zones of the tissue contained cells with solid, fiber-like walls. Both authors realized that the body of most plants consisted of recurring elements (fibers, vessels, cells), which would remain stable (unlike bubbles, optical phenomena, and the like) during the process of preparation and observation. Grew in particular began to imagine that all plants had a cellular constitution, an aggregation of smaller units.
However, cells remained the product of a complicated microscopical vision and microtomical preparation. At this scale, it was impossible to assert that there were cells at all. Lines or circles in the sample could result from clefts in the substance (like a crack in a window pane refracting light, i.e. a line that is itself immaterial), or from flattened tissue. On the other hand, it was understood that living bodies contained fluid, and that there were vessels (such as arteries and veins) communicating it. It was known from bodily experience that digestion and respiration permit the transport of substances, and that the exterior skin of a body (like the human one) could prevent blood from leaking while allowing the same body to sweat. Based merely on analogy, it could be postulated that microstructures such as vessels or cells would have functions similar to, for instance, arteries, and that internal diaphragms could serve as some sort of skin. But because there was no general definition of elementary phenomena such as humidity and no distinct relation among fluid, solid, and gaseous bodies, terms such as ‘sweat’ or ‘transpiration’ would have had different meanings at that time. The attempt to answer these questions systematically in the period around 1800 had consequences for the spatial and temporal definition of life and decay, motion and development, generation and morphology, and led to the differentiation of disciplines such as biology and chemistry, and to the final separation of fauna and flora.


One of the disputed observations in botany around the year 1800 was the distinction of structural elements, with a focus on the fascinating spiral vessels and their origin. In order to clarify the question, the Royal Society in Göttingen advertised its annual prize for an answer, asking for the submission of treatises in its Gelehrtenblatt, with a brief note that pretended to reflect the state of botanical discussion while instead being a somewhat uninformed melange of quotations. The motivation for the Göttingen contest remains a conundrum. The only apparent clue (besides the announcement and the final notice of the jury’s vote in the almanac) is a single sheet of paper preserved in the academy’s archive, on which the members of the academy gave their opinions on the texts handed in. Unfortunately, it documents a purely formal discussion among the colleagues (AdW 1805-06). Three texts were submitted, by Heinrich Link, a botanist, and by Carl Asmund Rudolphi and Ludolf Christian Treviranus, both physicians. At this time, Rudolphi was already a renowned anatomist and medical expert, who, like Link, would later be appointed a director of the new Berlin university. All three essays were behind the times in terms of botanical knowledge, full of errors, and contradicted each other in many respects. However, all of them were accepted, which supports the idea that the commission could not judge their quality (Petz 1987: 15), as does the fact that the leading French researcher Charles François Brisseau de Mirbel – even mentioned in the announcement together with Bernhardi – was not invited to the competition. He protested harshly at having been ignored when he read the Gelehrte Anzeigen the following year. His letter to the academy, attached to a recent publication, caused no small amount of trouble among the members, who did not know how to handle the issue.
Chroniclers unanimously confirm that only Treviranus was partly right; some saw the entire contest as an expression of ignorance. Significantly, Treviranus was not awarded the prize but received a citation (accessit), although he had at least understood that all vessels had a wall structure that was differentiated during the development of the plant. Rudolphi interpreted the cell wall as a single but porous barrier; Link suggested that it was completely closed and thus faced problems in explaining the transport of fluid. Both of them reported the results of Mirbel (who had devoted considerable systematic effort to the study of phytotomy) and his argument with Bernhardi in a rather abbreviated way that did not really clarify their own argumentation, even if it demonstrated the authors’ up-to-date erudition (in particular Rudolphi’s, see Hagner and Vesper 1991: 45).

Art History of the Cell

A Göttingen artist named Christian Besemann executed the engravings for all three essays, although the books appeared with different publishing houses and in different cities. His name is almost forgotten in art history and hardly mentioned in the literature on scientific illustrators, even though he seems to have produced plates for books in various fields (for Besemann see Kleineberg 1968; Appel 2003; Schulze 2004: Appendix). Thus, the competition not only provides interesting insights into an early stage of cell research that led to the understanding of the cell as a distinct unit of life and to cytology, but also conveys the contribution of artistic practice to the conception and visualization of cells. In this respect, the cell may be called a ‘technical image’ in that it stands for the sum of different forms of expertise that met in fields such as anatomy and microscopy and that blended scientific investigation with certain artistic styles of representation. As images had become the reason for the sustained success of microscopy books, for instance Hooke’s and Malpighi’s, professional illustrators gained attention among scientists who were
themselves usually able to draw but required assistants for the translation of drawings into woodcuts, etchings, or colored depictions of an equivalent quality. In the course of the seventeenth and eighteenth centuries, authors began to build their arguments increasingly on the basis of figures, sometimes referring to them exclusively. As a means of communication, images thus gained further importance, with an impact on scientific thought itself, as they could be at the center of an academic dispute.



Figure 1: Detail from Treviranus 1806, plate 1, figure 1. Copper engraving by Christian Besemann (photo Bruhn).

The Göttingen prize gives examples of this. As mentioned, it seems that only Treviranus fully understood that cells are not only constructive elements (or simply the spaces between elements) but kernels extending into and filling the empty space around them during their development, creating vessels and fibers through functional differentiation. The first of the depictions at the end of his essay shows the parenchyma of a plant, in the manner of a developmental scheme running from left to right, to illustrate this. His competitor Heinrich Link referred to this illustration and concluded: “Nach dieser Theorie müssten die Scheidewände der Zellen überall doppelt seyn.” [“According to this theory the diaphragms of the cells would have to be doubled everywhere.”] (Link 1807, p. 14) Link refused to accept this conclusion, as he refused to speculate at all, claiming in the introduction to his own treatise that all statements were to be purely descriptive. Yet he tacitly accepted Treviranus’s illustrations in such non-speculative descriptions: “Da, wo die Zellen an einander stossen, bemerkt man oft einen doppelten Strich (s. Fig. 1), gleichsam einen Zwischenraum zwischen den Zellen.” [At the point where the cells touch each other, one can often see a double stroke (see figure 1), almost an interspace between the cells.] (Link 1807, p. 13)

This seemingly banal description of double strokes was a remarkable statement. Involuntarily, it declared a drawn line (i.e., the German Strich, which derives from the language of drawing or painting) to be an entity, and its depiction a piece of evidence. Still, Link rejected Treviranus’s conclusions, stating that his own, more accurate examinations (genauere Untersuchungen) would disprove them. The relation between scientific and artistic practice became closer when universities in the German states and other European countries began to employ illustrators around 1800. The illustrators had a particular standing in academic life, and the ongoing specialization in science required a
corresponding specialization in their techniques, although their training followed independent rules. At this time, there was not yet a strict separation of fine arts and applied technology. Some of the draughtsmen and engravers became lifelong partners of researchers. Besemann was offered a post as a staff member at Göttingen university in order to counteract an offer to move to Saint Petersburg. Some years later, the embryologist Karl Ernst von Baer confirmed that he was traveling to Berlin solely to find a good engraver, as embryology demanded the highest working speed and rare skills: “It is necessary to always have a draughtsman at hand, particularly to depict the ovule and embryos in their early stages, which decay too fast. For the copper plate it is also essential to have an engraver work directly under the supervision of the observer, for it is almost impossible to attend to corrections from afar.” (Quoted after Koch 1981: 181f.)

Intersections

Detailed cell and membrane structures and their function remained invisible for reasons of optical resolution; better lenses were in constant demand, and microtomical preparation made only slow progress (Bracegirdle 1986). Link and his competitors employed standard instruments from the Dollond family, with a magnification of 200 times; they could not yet profit from the better lenses that would be manufactured by Fraunhofer a few years later.



Figure 2: Detail from Rudolphi 1807, plate 4 (photo from Barbara Herrenkind, Humboldt University).

But the question would not be answered with the help of better lenses alone, as any optical microscope demands fine slices of a sample that are transparent enough for light to pass through and will display objects in a sagittal view only (as in X-ray photography). The ability to see surfaces, holes, or vessels instead of lines, dots, or circles was the result of an intellectual extrapolation capable of rendering spatial objects from mere lines. Such operations are rooted in a longer history of spatial modelling as well as metaphorical thinking usually subsumed under the general term “visual culture” (for the role of metaphor in the biological, particularly the cytological, context cf. Maasen, Weingart and Mendelsohn 1995; Maasen and Weingart 2000; Otis 2000; Johach 2008). Scientific illustration reflected the progress and standardization of book printing, with its accompanying conventions for the format and design of depictions (the use of lines for frames or scales; ciphers and letters as identifiers for figural elements; the spatial representation of objects in perspective or the use of shadowing; or the pairing of different views of an object, as is still common in modern drafting). Only on the basis of such traditions was it possible, for instance, to interpret a woodcut showing hatching (in German, Schraffur) as a representation of a shadow or a gray zone instead of fine lines as such. Aware of the importance of such visualization techniques, Treviranus later decided to write a critical history of botanical woodcut illustration (Treviranus 1855). The term culture may also indicate here that scientific observation is not a given datum translated into the language of fine arts and invested there with extra meaning. As people learn to deduce spaces from two-dimensional depictions, they also learn to interpret the planar microscopic image on the basis of certain experiences that are cultivated in arts and architecture. Seeing grains of starch, for example, where there are only dots is thus not an over-interpretation of the findings (see figure 2, a detail from Rudolphi 1807, plate 4, where a series of cells is depicted as a chain to convey the desired impression or description). The artists’ cultural education and manual training, their styles of visualizing, melded artistic vision with a broader culture of graphic recording and notation (for the art historical notion of “style” in the scientific context cf. Ginzburg 1998; Bredekamp et al. 2008: 36-47). In a figurative sense this melding is still present in any MRI scan or gene sequencing. It was strongly influenced by a doctrine of drawing as an elite technique for visualizing the invisible. The time around 1800 was not only an “age of drawing” in terms of intensified academic and public training or anatomical practice (see e.g. Kemp 1979; Bermingham 2000; Schulze 2004; Comar 2008), it was also the zenith of a linear style of representation that revived the sixteenth-century academic doctrine of disegno, which had declared the line intellectually superior to color in that it uncovers structures of nature accessible only to the talented. In the long run, in combination with a classicist ideal of purity and clarity, this led to a style of precise outlines, contours, and silhouettes. Choosing an artist also implied preferring or adopting his personal style of depiction and his preference for certain printing techniques. Besemann was well known for being an accurate, high-precision draughtsman. When a printed result had to be deutlich (the German word for “clear,” which translates verbatim into “interpretable”), his images realized the immaterial (or hardly visible) line in exact strokes and continuous lines. Pen and ink would have permitted soft gradients, and techniques such as aquatint and mezzotint had been available for a generation to translate drawings into print, but they were expensive, and experts such as Besemann or his follower Thiele (who also worked for Treviranus in later years) did not adopt them. On the contrary, they handled the engraving like a microtomical procedure.
When asked to place an image of a diagonal cut of a twig on the plate next to the microscopic magnification, in order to demonstrate the original size of the object, Besemann did not content himself with a mere scale or circle but rendered delicate details that went beyond the resolution of the paper fibre of the attached plates. The detail shown here (Rudolphi 1807, plate 3, figure 3) has a size of 1.5 millimeters and is almost invisible without a magnifying glass: a symbolic display of artistic mastery.


Figure 3: Reproduced from Rudolphi 1807, plate 3. Copper engraving by Christian Besemann (photo Barbara Herrenkind, Humboldt University).

The old search for the finest possible line (as reported in an anecdote about the ancient painter Apelles, but also present in physical and mathematical discourses) took on a new connotation when challenged by the visual representation of membranes and cell walls, while microscopy, in turn, was the quest for the lines of nature by other means. Engraving the microscopist’s drawings in this manner might even have supported the idea of the isolated cell as an independent system, as the depictions remained architectural, mechanical, or textile-like. Recent research (at the European Molecular Biology Laboratory, for example), as well as the visualization of the cell in modern textbooks, in film and video, and in diagrammatic, stereoscopic, biophotonic, and nanotechnological techniques, still suggests the existence of distinct structures, walls, skins, channels, and fibers. Cell research has a pictorial history with identifiable styles. Images of cell microscopy display a mixture of intellectual and technical styles and skills, of media and constraints, of collective ways of seeing, formal traditions and iconographies. A stylistic analysis of botanical illustrations is a major objective of this study. Its aim is not to separate the image from its context, nor does it imply any explanation, interpretation, or evaluation. It is rather based on the fact that images convey significant temporal changes of representation that give access to another aspect of scientific history. It also raises the historiographical question of “tradition” in the field of scientific images, as it suggests, for instance, searching for the individual drawings made by the participants of the Göttingen competition, their communications with the engravers, or the negotiations with publishers concerning the price of black-and-white or even colored prints. These art historical components of scientific development reveal a cultural heritage of their own, worthy of further interdisciplinary investigation.



Figure 4: Detail from figure 3 (photo Barbara Herrenkind, Humboldt University).

Discussion

Klemun: I want to go back to the image of Rudolphi, with the image of the cell. If you look at this line, it is a chain, and I would like to ask you – how can you explain that, because it is a special structure?

Bruhn: I remember that it is called a series or an agglomeration of cells that creates a wall, but when I looked at the structures there was something that clearly resembles a chain of rings. Either the observer Rudolphi himself gave instructions, telling the artist “I see something that reminds me of rings” (and somebody then executed it in this way), or – and that is my suggestion – Rudolphi made drawings himself when looking through the microscope, and the person who executed the etchings misinterpreted what Rudolphi saw because he could not double-check by looking through the microscope. Sometimes you can see that Besemann made a lot of etchings under the direct supervision of anatomists in Göttingen, i.e. he was directly observing what he was doing and could verify it. But in this case problems occurred due to the fact that the authors of these three books were in three different places: Rudolphi was in Greifswald, Treviranus in Bremen, and Link in Rostock at that time. All materials had to be mailed, the manuscripts along with the drawings. Moreover, the authors were expected to print the treatises on their own account. The Göttingen files contain letters by Rudolphi, who complained about the fact that he was expected to use his award for the printing costs. The academy did not seem to really care for such matters, and all that was recorded during the discussion were formal aspects like “Is it possible that we accept the treatise handed in in German, whereas the regulations said it has to be in Latin, and if we accept that, somebody could also submit a text in English?” And Blumenbach replied, “No, at this time we accept it in German.” Thus we have to examine the archives and collections of the authors and see what is left.

Wall: You said at one point that the kind of line used here comes out of a mannerist style. Can you say something about that? I’m familiar with some oil paintings of that time, but not so much with the theory.

Bruhn: There is almost a “dogma of the line” emerging in the sixteenth century at Italian art academies (a period often termed “mannerism”) which dominates the discussion of art and its intellectual quality: a “true” artist is the one who is able to see or imagine the lines of nature, i.e. the essential elements or structures, which becomes an argument for the excellence of drawing as a technique among artists and academies. This is an older idea that goes back to medieval times and the discussion of the dichotomy of color and line. Second, there is also the intention to support a certain art movement; this was the case with Giorgio Vasari, who promoted the Florentine way of painting as a model for the primacy of a linear style opposed to the Venetian “colorist” way of painting. Later, this led to the academic understanding of the beau idéal present in nature. The outline, the depiction of the silhouette or Gestalt, is now claimed to be a representation of truth, while color is only additional, which is strange because color is also constitutive, as every painter knows from his own practice.

Fiorentini: I just want to point out that this is true, for example, for the dispute between colore and disegno: the idea of distinguishing appearances and contents, the line as the mimetic idea of reality, the idea that intends [to convey] “there are things inside that are not visible,” whereas color is the appearance, so to speak, of the naturalistic impression you have.

Bruhn: The ability to outline nature stands for an ability to see “through” things, a diagnostic view, which on the other hand is based on a certain manual skill, the art of drawing, the technique. As Barbara Wittmann and others have recently shown, drawing combines the physical act, which can be trained, with an act of thinking, of education. In Humboldt’s terminology this is Bildung. In a similar way Horst Bredekamp, looking at Galileo’s work, puts it as “drawing is the moment of grasping the thing.” Yet what I wanted to show in the first place is that there is also some sort of “theory of the line” inherent in the discussion of what a “membrane” is.

Bibliography

AdW 1805-06: Archive of the Göttingen Akademie der Wissenschaften, Scient 1832 Fasz 9, 1805-06, Inv. VA6b a, GG 2.
Appel, Thomas. 2003. Biographische Ergänzungen zu dem Göttinger Zeichner und Kupferstecher Christian Andreas Besemann (1760-1818). In: Göttinger Jahrbuch 51, pp. 27-48.
Bermingham, Ann. 2000. Learning to Draw: Studies in the Cultural History of a Polite and Useful Art. New Haven: Yale University Press.
Bracegirdle, Brian. 1986. A History of Microtechnique: The Evolution of the Microtome and the Development of Tissue Preparation. 2nd ed. Lincolnwood, Ill.
Bredekamp, Horst, Birgit Schneider and Vera Dünkel (eds.). 2008. Das Technische Bild: Kompendium zu einer Stilgeschichte wissenschaftlicher Bilder. Berlin: Akademie Verlag.
Comar, Philippe (ed.). 2008. Figures du corps: Une leçon d’anatomie aux beaux-arts (Exhibition Catalogue, École nationale supérieure des beaux-arts, Paris, 21.10.2008-04.01.2009). Paris: ENSBA.
Dröscher, Ariane. 2008. “Was ist eine Zelle?” Edmund B. Wilsons graphische Antwort. In: Kaasch, J. and M. Kaasch (eds.): Natur und Kultur: Biologie im Spannungsfeld von Naturphilosophie und Darwinismus (Verhandlungen zur Geschichte und Theorie der Biologie, 14). Berlin: VWB, pp. 191-201.
Ginzburg, Carlo. 1998. Style as Inclusion, Style as Exclusion. In: Jones, Caroline and Peter Galison (eds.): Picturing Science, Producing Art. New York/London: Routledge, pp. 27-54.
Hagner, Michael and Elisabeth Vesper. 1991. Einige Nachrichten über die Bibliothek des Anatomen und Physiologen Karl Asmund Rudolphi. Wolfenbütteler Notizen zur Buchgeschichte 16(1): 41-62.
Johach, Eva. 2008. Krebszelle und Zellenstaat: Zur medizinischen und politischen Metaphorik in Rudolf Virchows Zellularpathologie. Freiburg i. Br.: Rombach.
Kemp, Wolfgang. 1979. “... einen wahrhaft bildenden Zeichenunterricht überall einzuführen”: Zeichnen und Zeichenunterricht der Laien 1500-1870. Ein Handbuch. Frankfurt am Main: Syndikat.
Kleineberg, Günter. 1968. Christian Andreas Besemann, sein Werk und seine Zeit: Zum 150. Todesjahr des Künstlers. Göttingen (offprint from Plesse-Archiv 3, 1968, pp. 9-30).
Koch, Hans-Theodor. 1981. Karl Ernst von Baer (1792-1876): Korrespondenz mit den preußischen Behörden. In: Wissenschaftliche Beiträge der Universität Halle 39, pp. 169-191.
Link, Heinrich Friedrich. 1807. Grundlehren der Anatomie und Physiologie der Pflanzen. Göttingen: Justus Friedrich Danckwerts.
Maasen, Sabine and Peter Weingart (eds.). 2000. Metaphors and the Dynamics of Knowledge. London a.o.: Routledge.
Maasen, Sabine, Peter Weingart and Everett Mendelsohn (eds.). 1995. Biology as Society, Society as Biology: Metaphors. Dordrecht a.o.: Kluwer.
Müller-Strahl, Gerhard. 2004. Der biologische Zell-Begriff: Verwendung und Bedeutung in Theorien organischer Materie. Archiv für Begriffsgeschichte 46: 109-136.
Otis, Laura. 2000. Membranes: Metaphors of Invasion in Nineteenth-Century Literature, Science, and Politics. 2nd ed. Baltimore: JHU Press.
Pelz, Willy. 1987. Zellenlehre: Der Einfluss Hugo von Mohls auf die Entwicklung der Zellenlehre (1944). Reprint Frankfurt am Main a.o.: P. Lang.
Rudolphi, Carl Asmund. 1807. Anatomie der Pflanzen. Berlin: Mylius.
Schulze, Elke. 2004. “Nulla dies sine linea”: Universitärer Zeichenunterricht – eine problemgeschichtliche Studie. Stuttgart: Franz Steiner.
Treviranus, Ludolph Christian. 1806. Vom inwendigen Bau der Gewächse und von der Saftbewegung derselben. Göttingen.
Treviranus, Ludolph Christian. 1855. Die Anwendung des Holzschnittes zur bildlichen Darstellung von Pflanzen. Leipzig (reprint 1949, Utrecht: W. de Haan).
Winter, Irene J. 1998. The Affective Properties of Styles: An Inquiry into Analytical Process and the Inscription of Meaning in Art History. In: Jones, Caroline and Peter Galison (eds.): Picturing Science, Producing Art. New York/London: Routledge, pp. 55-77.


Movement and the Creation of Images Silver Rattasepp

It may perhaps be worthwhile to reflect briefly on the relationship between perception and the nature of pictorial images. More precisely, it may be worthwhile to consider the nature of a simple image, such as a line drawing, and its creation by an organism whose perception of the world depends fundamentally on movement. Traditional approaches depict perception within the paradigm of eye-as-camera and stimulus-response theories, and consequently treat pictures as still representations or snapshots of the visual field. If, on the other hand, we take seriously the enactive approach to perception, according to which perception depends fundamentally on movement, and add to it further elaborations on the nature of organisms as constantly on the move both temporally and spatially, the nature of the act of creating images may need some rethinking. I will suggest that the “primary act” of creating a visual image is itself, at its core, an act, a movement. In what follows, I will be mainly reflecting (1) on the nature of perception as conceived within the so-called enactive paradigm (Varela et al. 1993; Noë 2004), (2) on the nature of the growth and development of organisms within the developmental-systems framework (Oyama 2000 [1985]; Oyama et al. 2003), and (3) on the nature of cognition in the “embodied, embedded mind” approach (Clark and Chalmers 1998; Gallagher 2006). I will then briefly reflect upon traditional premises of psychological research on perception and contrast them with some of the conclusions that may be drawn from this triumvirate of “organisms on the move” approaches. Finally, I will comment on what conclusions may be drawn from all of this regarding the tendency of human beings to produce pictorial images, especially line drawings. “Human life,” it has been argued, “is based on and in movement” (Thrift 2008, p. 5). This “leitmotif of movement” (ibid.) will be my guide in what follows.
There are, roughly and increasingly metaphorically speaking, three interconnected ways of conceiving organisms – and in the present instance, primarily human beings – as being thoroughly “moving,” that is, as being fundamentally process-based. The most straightforward case is perception. Gibson has argued that perception can only take place as long as an organism moves with respect to the underlying structure of edges and surfaces (Gibson 1986 [1979], pp. 65-92), specified by what Gibson called the ambient optic array. Visual invariants – literally the things that we ultimately perceive as objects, but which are specified in the array by their surfaces, textures, edges – appear as an organism moves itself in relation to what is perceived. This results in several distinct yet stable changes in the visual field, which, to use an adage from Gregory Bateson – that it is a difference which makes a difference – makes us perceive our surroundings (Bateson 2000, p. 459). Alva Noë has argued that since knowledge on the part of the perceiving organism about these characteristic patterns of movement or visual invariants is necessary, visual perception requires “certain kinds of bodily skills – for example, a basic familiarity with the sensory effects of eye or hand movements” (Noë 2004, p. 2). Thus to perceive means to move around, as a knowing agent, in the environment. “Perception is a way of acting. It’s not something that happens to us, or in us. It’s something we do” (Noë 2004, p. 1). Now if to perceive is to know, it may also be surmised that one has to test and thereby learn these relevant patterns of changes in the visual field. This is most likely a reciprocal process. On the one hand, in order to come to know how the surfaces and textures of the visual world operate, one has to be in constant and close engagement with the world, picking up its invariants and
stabilities; thus we learn through the act of detecting such patterns. But on the other hand, as we become more skilful as a result of this constant attentive engagement with the world, we will learn to detect new patterns and pick up on things that we previously could not. Through this fine-tuning of perceptual skills, meanings immanent in the environment – that is in the relational contexts of the perceiver’s involvement in the world – are not so much constructed as discovered. (Ingold 2000, p. 22)

From this it seems to follow that it is not just movement, but self-movement that is required for fully developed three-dimensional vision. A striking demonstration of this has been provided by Richard Held and Alan Hein. Working with kittens, some of which were able to move themselves while others were not, yet were still moved in relation to their surroundings, they noticed that several aspects of the latter group’s behavior were considerably degraded, such as visually guided paw movement and discrimination on a visual cliff. They concluded: “Self-produced movement with its concurrent visual feedback is necessary for the development of visually-guided behavior. Equivalent, and even greatly increased, variation in visual stimulation is not sufficient.” (Held and Hein 1963, p. 875)

Thus there is experimental evidence that visual perception requires not only movement in relation to the surroundings but, fundamentally, the capacity to move oneself. Once again it may be tentatively concluded that perception can only be a property of a self-moving, knowledgeable agent. (On the ethical aspect of experimenting on kittens by tying them to a carousel for extended periods of time to prevent them from moving freely about, the authors at least note “the rapid recovery of function […] once given their freedom” from their “rather mild conditions of deprivation” [Held and Hein 1963, pp. 875-876].) The second, and more metaphorical, conception of organisms as moving sees them “moving” in time – that is, developing. Once again, Gibson has argued that perception amounts to a practice and “the education of attention” (Gibson 1979, p. 254) – an attunement of the sense organs, during the process of ontogeny, to the invariants of the surrounding environment. 1 Thus any sense organ – or rather, perceptual system, since the activity of perception is not reducible to a specific lump of organic matter – can only begin to function properly if it can develop in a field of relationships constituted by environmental invariants and the internal developmental resources of the growing organism, resulting in a particular coupling of environment and organism, thereby rendering perception proper possible. Thus development – and this includes the development of perceptual systems – is constituted over time from myriad inputs that are not limited to those bound by the skin. “The life cycle of an organism is developmentally constructed […]. It comes into being through interactions between the organism and its surroundings as well as interactions within the organism.” (Oyama et al. 2003, p. 4)

This directly leads us to the third and the most metaphorical sense in which organisms are constituted by movement – the impossibility of reducing perceptual and cognitive activities down to a skin-bounded entity called the “organism,” set over and against the background of its environment. For an organism thoroughly and inescapably coupled with its environment, perception and cognition flow freely across the organism’s boundary. As they move about their environment, perceivers make free use of certain aspects of the environment that work in a predictable way – in other words, they rely on those parts of the environment that they can lean on as supports for further activities. 

1 See especially Ingold (2001) for a thorough explication of this idea.


One particularly interesting way this happens has been called “cognitive offloading” (Clark 1997, p. 94), referring to the tendency of human beings to make use of the environment to make their cognitive operations easier. A simple example would be the use of pen and paper as aids in mathematical calculation. Another telling example is provided by the computer game Tetris, which most readers are hopefully aware of and have perhaps even played. In the game, the player has to align differently shaped pieces by rotating them so that they fit together without any gaps. When you observe people playing Tetris, you notice that they immediately start rotating the pieces the very moment they appear, in order to find a suitable place for them. Now imagine trying to play this game without rotating the pieces, so that you have to find a proper place for a piece solely in your mind before you are allowed to move it. It becomes immediately clear (and everyone is welcome to try it out themselves) that this is quite difficult, and as the speed of the game increases, it becomes next to impossible. Rotating the pieces in real time, in the world, offloads some of the cognitive work of finding a suitable place for the piece – such rotations are “actions whose purpose [is] to reduce inner computational effort” (Clark 1997, p. 66). Thus cognition is not something that happens “in the head,” nor even something that happens in the body, but is rather a relational field of activity that freely crosses the boundaries of the brain and the body, “forever leaking out into its local surroundings” (ibid., p. 82). What we may conclude from all of this is that organisms – and human beings especially – are skilful actors constantly on the move in the environment, while at the same time having been built up over time from a large number of different developmental resources, all of which have some sort of a constitutive effect on the various parts of the organism.
To present the full impact of these approaches, it may perhaps be worthwhile to contrast this with older, admittedly rather obsolete ideas from the psychology of perception, such as the concept that the visual system works something like a camera. According to this perspective, visual perception can be conceived as a constant flow of information into a passive sense organ, which constructs sequential images out of this flow. The perceiver is passive, and cognition amounts to the construction of internal mental representations out of the constant influx of sense data. Thus the mind operates rather like a “mirror of nature,” as Rorty famously called it in his classic criticism of the idea (Rorty 1979). According to this perspective, viewers themselves have almost no deliberate, constitutive impact on what is perceived, and any visual image, from line drawings to photographs, can be imagined as a snapshot of the input flow. If this is the case, one is naturally led to two conceptions of what would constitute the best images to be used in psychological research on perception, those that would be the most “natural” for human visual perception, whether they are construed as the fundamental constitutive elements or the entirety of the visual field: simple geometric shapes and photographs. Simple geometric shapes are frequently used in research, and they seem to be considered the “primitives” of visual perception. Subjects are often presented with them while being attached to special apparatuses that hinder their movement. Thus the very set-up of such experiments precludes any impact of movement on perception. Photographs may seem “better” in a sense, but whereas geometric shapes are usually displayed to stationary observers, photographs are themselves petrified in time and represent a fixed, unmoving point of observation. 
Thus they cannot be explored by the perceiver in the sense of being able to move about the image in order to perceive it from different perspectives and to gain further, novel information from it. But the ability to move around and take new looks from different perspectives at a given object is precisely what perception and cognition are fundamentally about. The experimental set-up where all movement on the part of the perceivers is arrested and the set-up where perceivers are shown static photographs make sense only when the paradigm of perception follows the stationary camera model. This error is further exacerbated by the stimulus-response theory, which, from the very start and by definition, precludes the importance of self-actuated movement in perception. Gibson also notes (Gibson 1979, pp. 256-257) that it is the aspects of exploration and self-movement that are distinctly absent from even the most vivid hallucinations and dreams. The inability to explore, to make “reality checks” by moving around an object, is characteristic of dreams, not of being engaged with the world, and thus the “criterion for real versus imaginary is what happens when you turn and move” (ibid., p. 257, emphasis in the original). Thus making use of line drawings shown to an otherwise immobile subject, or of petrified photographs, is more akin to studying a person in a dream world or in the grips of hallucinations. What then would be the nature of a visual image for such a creature constantly on the move? Perhaps somewhat paradoxically, the “primitive” of depiction, the most basic image, or its most basic constituent element, would be the activity of tracing a line, the act of drawing on a surface – a coordination of the eye-head-hand system over time. A trace, a movement, a co-ordination – an act for an active creature, so to speak. In this case, a prototypic image would be a line drawing: a result of the delineation of new invariants on the surfaces encountered in the environment. As Noë eloquently notes:

It is making pictures – the skilful construction of pictures – that can illuminate experience, or rather the making or enacting of experience. Picture making, like experience itself, is an activity. It is at once an activity of careful looking to the world, and an activity of reflection on what you see and what you have to do to see (Noë 2004, p. 179, emphasis in the original).

And as this very collection of texts presented at the workshop indicates, line drawings are still – despite the most sophisticated methods of depiction and visual analysis – something nearly everyone reverts to at some time or another. There are obvious pragmatic reasons for this, such as the ease with which they can be created and their clarity, yet these are only further proof that for human beings, line drawings are “close” to them, something they can comprehend more readily than complex images – more readily even than photographs, which are purportedly one-to-one correspondences with the visual field! Thus we may notice that in order to demonstrate or to indicate, researchers tend to display the best image possible – such as, in the present workshop, 3D depictions of embryos, cells, or brains. But when there is a need to explain, to explicate in detail, it is line drawings that are used. Finally, it should also be noted that whereas all pictures are naturally still, it is the line drawing elements – arrows – that are almost invariably used in order to depict movement. Frequently, arrows are even superimposed on photographs and other complex high-quality images. One could perhaps speculate that for such a moving organism as the human being, for whom creating images is fundamentally about the act of drawing traces on surfaces, arrows, as the most elementary form of such traces, would be most readily understood as indicating movement. A line, “whether traced in the air or on paper, whether by the tip of the stick or the pen, […] arises from the movement of a point that […] is free to go where it will” (Ingold 2007, p. 73).

Discussion

Perini: There are two things that I really liked about your talk: the comment about the importance of cognitive off-loading and the kind of role images can play with that, and also the very intriguing idea of thinking of perception as navigating, as moving around the environment, and so how many environments are involved in perception. Now, what I didn’t like about the talk was the quote “the world is its own best representation.” I hate that because it is not – the world is full of all sorts of interesting, complicated details, but it is lousy as a representation. You can’t communicate with it, you cannot point to the world and have anyone understand what you have in mind. So when it comes to the sciences, it is pretty much useless, so scientific representations – like all visual representations – are selective, and selectivity can be really useful. It can be carried to extremes, and then you get these schematic line drawings, but even a more traditional pictorial representation is selective in some ways – black-and-white photographs don’t represent color; even a regular photograph only represents objects from one perspective, leaving out information from the back. You don’t get the kind of information that you would get when walking around the object. I was wondering: is there another way to put the two together without requiring that images be successful, that they have to be somehow like an object in the world without all of its properties? We find a useful role for images as part of our environment, as something that we are experiencing and can offload to, but it is still selective.

Rattasepp: First of all, I tried to present a broader outlook, and I may have put the quote a little bit out of context, to the detriment of my argument [laughter], because if you ask what a “representation” actually is, people will say “it is something that stands for something else.” I guess then, in that sense, this quote actually becomes nonsensical: “The world is its own best thing that stands for something else,” whatever this means. But in general, there are two very important things to add about representations, which are basically: (1) that what is represented does not have to be present, and (2) a representation does not really have to look anything like the thing represented – words are a perfect example. Yes, there is not only selection, there is also meaning and agreement when choosing a representation, and that is probably true when talking about visual representation. But as for the second part of the question, I really have to think about how to put the two together.


Bibliography

Bateson, Gregory. 2000 [1972]. Steps to an Ecology of Mind. Chicago: University of Chicago Press.
Clark, Andy. 1997. Being There. Cambridge: MIT Press.
Clark, Andy and David Chalmers. 1998. “The Extended Mind”. Analysis 58(1): 7-19.
Gallagher, Shaun. 2006. How the Body Shapes the Mind. Oxford and New York: Clarendon Press.
Gibson, James J. 1986 [1979]. The Ecological Approach to Visual Perception. Hillsdale: Lawrence Erlbaum Associates.
Held, Richard and Alan Hein. 1963. “Movement-produced stimulation in the development of visually guided behavior”. Journal of Comparative and Physiological Psychology 56(5): 872-876.
Ingold, Tim. 2000. The Perception of the Environment: Essays in Livelihood, Dwelling and Skill. London and New York: Routledge.
Ingold, Tim. 2001. “From the transmission of representations to the education of attention”. In Harvey Whitehouse (ed.), The Debated Mind: Evolutionary Psychology versus Ethnography. New York: Berg, pp. 113-154.
Ingold, Tim. 2007. Lines: A Brief History. London and New York: Routledge.
Noë, Alva. 2004. Action in Perception. Cambridge: The MIT Press.
Oyama, Susan. 2000 [1985]. The Ontogeny of Information. Durham: Duke University Press.
Oyama, Susan, Paul Griffiths and Richard Gray (eds.). 2003. Cycles of Contingency: Developmental Systems and Evolution. Cambridge: The MIT Press.
Rorty, Richard. 1979. Philosophy and the Mirror of Nature. Princeton: Princeton University Press.
Thrift, Nigel. 2008. Non-Representational Theory: Space, Politics, Affect. London and New York: Routledge.
Varela, Francisco, Evan Thompson and Eleanor Rosch. 1993. The Embodied Mind. Cambridge: MIT Press.


Spaces of Interpretation

What do Genetic Maps Represent (and How)?
Marion Vorms

My purpose here is to analyze the interplay of two functions of the use of representations by scientists, namely the computational and the representational functions. A diagram, a graph, a microscope image, or any other device that is used and manipulated in scientific theorizing and practice is both a tool of prediction that has to be easy to calculate with and a tool to represent the phenomena under study. I will present the invention of linkage mapping by Alfred Sturtevant in 1913 and the debates it gave birth to around the 1920s as a means to study the way computability constraints and theoretical commitments interplay in the use and interpretation of maps by different geneticists.

The context of the invention of linkage mapping

Linkage mapping is a technique invented by Thomas Morgan’s student, Alfred Sturtevant, in 1913. It consists of mapping the relative locations of genes on a linear map according to their relative frequency of recombination. This technique was born in a double disciplinary context, namely the context of the Mendelian theory of heredity on the one hand and cytology on the other hand. These two contexts are different in terms of their experimental techniques as well as their representational practices.

Mendelian theory of heredity in the 1900s: symbolism and experimental practices

The experimental practice proper to the Mendelian theory of heredity consists of breeding experiments, where the transmission of hereditary factors (called genes since 1909) is traced by observation of the phenotypes of the individuals in successive generations and deduction of their genotypes according to Mendel’s two laws. Mendelian genetics deals with numerical data and statistics and relies on mathematical combinatorics. All these characteristics are obvious when one considers the “symbolism” introduced by Mendel to express the two principles now known as Mendel’s laws: it consists of representing factors as discrete entities by means of letters or icons that do not commit one to any particular thesis concerning the materiality and internal structure of these entities, but that enable one to apply mathematical combinatorics to them. In this framework, thus, genes are abstract operational units.
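The combinatorial character of this symbolism can be sketched in a few lines of code. The following is a toy rendering using modern notation, not Mendel’s own: factors are treated as discrete letters, and offspring genotypes are generated purely by enumerating gamete pairings.

```python
from itertools import product

# A toy illustration (not Mendel's own notation) of treating factors as
# discrete symbols and applying pure combinatorics: a monohybrid cross Aa x Aa.
def cross(parent1, parent2):
    """Count offspring genotypes from all gamete pairings of two parents."""
    offspring = {}
    for g1, g2 in product(parent1, parent2):
        genotype = "".join(sorted(g1 + g2))  # 'Aa' and 'aA' are the same genotype
        offspring[genotype] = offspring.get(genotype, 0) + 1
    return offspring

print(cross("Aa", "Aa"))  # the familiar 1:2:1 ratio: {'AA': 1, 'Aa': 2, 'aa': 1}
```

The point of the exercise is that nothing in it depends on what the factors materially are: the ratios follow from the symbols alone, which is exactly the sense in which Mendelian genes are abstract operational units.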

Cytology

Cytology is the study of cellular processes. It relies on visualization techniques. The representational devices that are used and studied are pictorial in an intuitive sense of the word: they are obtained by microscope techniques – chromosome images are images of concrete material entities. In this sense, the entities cytology deals with are not at the same level of abstraction as those in Mendelian genetics.


Figure 1 A and B: A presentation, in mathematical formulae and in tables known as Punnett squares, of the two laws. From: 1 A: Morgan 1928, p. 2; 1 B: Morgan 1928, p. 91.




Figure 2: Visualization techniques, ‘pictorial images’; observation of ‘concrete entities’: cells, chromosomes; no observational proof for the chromosome theory of heredity. From: Bridges 1916, pp. 110-111.

In the period I am interested in – the first decade of the twentieth century – processes such as cellular division were quite well known, but there was no observational access to the fine structure of chromosomes. In consequence, the chromosome theory of heredity, according to which Mendelian factors (genes) are located on or identified with portions of chromosomes, had no observational basis. It relied on various analogies between the behavior of chromosomes during cellular division and the laws of transmission of genes, but there were still various puzzles to solve in order to defend it, and it was far from being unanimously adopted.

Prehistory of linkage maps From 1906 to 1910, an increasing number of exceptions to the law of independent assortment (Mendel’s second law), which predicts that different pairs of genes should segregate independently, were observed by William Bateson, Reginald Punnett, and also Thomas H. Morgan. They observed that some genes had a tendency to be inherited together, a phenomenon that is called “partial linkage.” Different explanations were proposed for it, according to each scientist’s theoretical commitments and prior beliefs. In 1909, the cytologist Janssens observed that homologous chromosomes intertwine during the prophase of meiosis. This observation, together with the phenomenon of sex linkage, led Morgan to finally adopt the chromosome theory of heredity and to formulate the theory of crossing over, which provides geneticists with a mechanical explanation of the phenomenon of linkage. His hypothesis can be stated as follows:  Thomas Morgan, for instance, who became its main developer after 1911, was still an opponent of this theory in 1910.


• Genes are ordered in a linear fashion on chromosomes (linearity);
• During meiosis, homologous chromosomes, which intertwine, exchange parts (exchange);
• The frequency of association of factors (which is constant for any given pair of factors and different for different pairs of factors) is related to the frequency of breaks and is therefore a function of the distance between genes.

When two genes lie on the same chromosome, they tend to be inherited together. When they are not inherited together, it is said that there is a crossing-over between them, which results in a recombination (an exchange of factors).

Sturtevant’s technique

In 1913, Alfred Sturtevant, still an undergraduate, turned Morgan’s suggestion into a representational technique, or better said, a mapping scheme, that transformed numerical data obtained by Mendelian breeding experiments into visualizable distances. This quantitative scheme consisted of using the proportion of observed crossing-over as numerical data enabling one to generate the relative location of genes on chromosomes. Sturtevant introduced the notion of genetic or cartographic distance: the distance between two factors is numerically defined as the recombination frequency, R, of these factors, i.e., the percentage of crossing-over. The basic scheme relied on a fundamental hypothesis (already formulated by Morgan), namely the linearity of the arrangement of genes in linkage groups, by virtue of which the distances are additive: the sum of the distances between A and B and between B and C should give the distance between A and C. By relying on instances of short distances, i.e., factors that recombine rarely, he constructed the map of six sex-linked factors situated on the X chromosome (see figure 3). His focus on short distances was due to his observation that, for greater distances, recombination frequencies were not additive: the observed recombination frequency was less than the sum of the smaller distances. To explain such exceptions to additivity, in 1913 he proposed the hypothesis of double crossing-over: if homologous chromosomes break and exchange parts at two points, the genes situated at the extremities will recombine twice, and the result will be as if there had been no crossing-over at all (see figure 4 B). Therefore, the notion of distance refers to the real recombination frequency, and real recombination frequency corresponds to observed recombination frequency only for small distances. The real recombination frequencies for genes at greater distances are obtained by addition of the frequencies of those at small distances.
The hypothesis of double crossing-over, as well as the interference hypothesis, according to which a crossing-over occurring at some point makes another crossing-over at a nearby point highly improbable, would be the object of many works by Muller, Bridges, and the Morgan group, and would be substantially refined and further developed. It is the core of classical genetics. In 1913, these hypotheses were still very poorly confirmed, but they were essential to the very definition of the notion of distance and therefore to an understanding of the technique of mapping.
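The failure of additivity over long intervals can be made concrete with a small simulation. The marker positions and the crossover model below are purely hypothetical (they are not Sturtevant’s data or procedure); the sketch only shows why double crossovers make observed recombination fall short of the sum of short distances:

```python
import random

random.seed(1)

# Hypothetical marker positions on a chromosome of unit length
# (illustrative values only -- these are not Sturtevant's data).
positions = {"A": 0.0, "B": 0.12, "C": 0.24}

def recombinant(p1, p2):
    """Simulate one meiosis: scatter 0-4 crossover points uniformly along
    the chromosome. Two markers recombine iff an odd number of points falls
    between them; an even number (a double crossover) restores the parental
    combination and goes unobserved."""
    n = random.randint(0, 4)  # crude stand-in for a crossover-count distribution
    points = [random.random() for _ in range(n)]
    between = sum(1 for x in points if p1 < x < p2)
    return between % 2 == 1

def rec_freq(m1, m2, trials=100_000):
    """Observed recombination frequency between two markers."""
    p1, p2 = sorted((positions[m1], positions[m2]))
    return sum(recombinant(p1, p2) for _ in range(trials)) / trials

r_ab, r_bc, r_ac = rec_freq("A", "B"), rec_freq("B", "C"), rec_freq("A", "C")
print(f"R(A,B)={r_ab:.3f}  R(B,C)={r_bc:.3f}  "
      f"sum={r_ab + r_bc:.3f}  R(A,C)={r_ac:.3f}")
# R(A,C) comes out smaller than R(A,B) + R(B,C): double crossovers between
# A and C cancel out -- which is why Sturtevant summed short distances
# instead of measuring long ones directly.
```

Under these assumptions the two short intervals give roughly equal frequencies, while the long interval recombines noticeably less often than their sum, mirroring the non-additivity Sturtevant observed.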




Figure 3 A: Table presenting the frequency of observed recombination of factors. From: Sturtevant 1913, p. 7.




Figure 3 B: Diagram of the relative distance of factors from factor B. From: Sturtevant 1913, p. 6.

Let me make some brief remarks concerning Sturtevant’s 1913 maps. First, one can observe that there is the exact same information in the table as in the map (see figure 3). This information consists of numerical data obtained by breeding experiments. Relying on my sketchy description of Mendelism as characterized by its practice of breeding experiments, one can say that these data are purely genetic (Mendelian). The map displays the numerical data in a spatial format without adding any more information than the table shows. Second, the very notion of distance, which is displayed in a diagrammatic way in the map, does not correspond to physical distance. Already in 1913, Sturtevant was well aware, as would be confirmed later, that chromosomes are not likely to break with an equal probability at all points and that distance might well be a measure of strength as well as of length.

The hypothetical status of the chromosome theory of heredity and the mechanical model of crossing-over

By 1915, the technique had been considerably developed by Morgan’s group. The frontispiece of their Mechanism of Mendelian Heredity triumphantly displayed the map for the four chromosomes of Drosophila melanogaster (see figure 5). In the following years, much progress was achieved in the study of crossing-over and interference, especially by Muller. Also, the phenomenon of non-disjunction, discovered by Calvin Bridges, provided a strong argument in favor of the chromosome theory of heredity. But it is crucial to my point to emphasize the fact that, until the 1930s, there was no observational proof of the fact that genes lay on chromosomes. The mechanical model of crossing-over, called the “beads on a string model” (see figure 4 B), is a hypothetical analogical model. It explains mechanically the phenomena of linkage and crossing-over. Since the chromosome theory was still hypothetical, and moreover the hypothesis according to which the genes lie in a linear fashion on chromosomes was highly hypothetical, the model does not pretend to depict the chromosomes in a realistic way. In this sense, it is not pictorial in character, in contrast to the images of chromosomes obtained by microscope techniques. Nevertheless, it has in some sense what I would call a “pictorial dimension”: its explanatory function relies on the fact that some topological relations are kept in the spatial representation; if the topological relations were not kept, the model would not be explanatory. I propose to call these kinds of representations, which are not obtained by visualization techniques, and which contain many details that do not pretend to be realistic, but whose representational function consists nevertheless of maintaining the topological relations, “schematic” in kind.
I will not get into more detail here, but it is quite obvious that the difference between pictorial and schematic representations is a matter of degree; one can also draw a schematic representation from a pictorial one (see Lynch 1988).

Figure 4: Schematic explanation of the mechanism of crossing-over and double crossing-over. From: 4 A: Morgan et alii 1915, p. 60; 4 B: Morgan et alii 1915, p. 62.

Figure 5: Linkage maps of the four chromosomes of Drosophila. From: Morgan et alii 1915, frontispiece.


The observational proofs

In the 1930s, observational proofs of the chromosome hypothesis were finally presented, which succeeded in convincing almost all the late opponents of the theory, such as Bateson. Observations of chromosome deletions, translocations, and inversions were directly related to genetic aberrations (Creighton and McClintock 1931), and finally in 1934, Painter’s observations of the giant chromosomes of the salivary glands of Drosophila showed microscopically observable banding patterns, which rendered deletions, inversions, and translocations immediately visible, and enabled cytogeneticists to map the diagram obtained by purely Mendelian techniques onto locations on the X chromosome in the visible image. Many predictions that had been made by purely genetic means were then confirmed (e.g., the fact that genetic distance does not strictly correspond to physical distance).



Figure 6: Mapping of the genetic maps onto chromosome images. From: Painter 1935, between p. 306 and 307.

Linkage maps had successfully been used before that proof was given, and one could use them without being committed to the chromosome hypothesis.

The debates around Morgan’s group’s technique of mapping

The mapping technique developed by Morgan and his students gave birth to various debates around 1920, which showed that linkage maps could be used by people with different theoretical commitments, and that they could be interpreted and used in various ways. In the following, I would like to suggest that, in the construction, use, and interpretation of maps, there is an interesting interplay of theoretical concerns and problems relative to the tractability and the simplicity of the very presentation of the data.


Let me briefly describe three important criticisms that were made of Morgan’s group; I will focus on the third one.

In 1917, Richard Goldschmidt, who remained a critic of classical genetics until the 1950s, proposed the idea of a crossing-over (i.e., an exchange of genes) without any chiasmatypie (i.e., exchange of parts of chromosomes) occurring. In order to explain the exchange of genes, he described them as charges, linearly located on the chromosomes, but he refused the mechanical explanation. He therefore accepted the linearity hypothesis and the hypothesis according to which genes lie on chromosomes, but he refused to accept the hypothesis according to which chromosomes break and genes are material parts of chromosomes.

William Bateson, in his 1916 review of the Mechanism of Mendelian Heredity, criticized the hypothesis that genes are parts of chromosomes. He assumed that genes are linearly ordered, and he therefore acknowledged the relevance of constructing linkage maps, where the distance between genes is calculated from the frequency of recombination, but he refused the chromosome theory as a mechanical explanation. He considered crossing-over an ad hoc hypothesis for which there was no observational proof. He was generally reluctant to consider genes as material units.

In 1919, William Castle proposed an alternative model for the linkage maps. His criticism was diametrically opposed to Bateson’s, as he accepted the chromosome hypothesis while refusing the linearity hypothesis. His arguments against linearity were of various sorts. First, he claimed that it was chemically improbable that chromosomes had a thread-like shape. The main argument was the second one: Castle claimed that the linear model of Morgan and his students did not fit their own data. The disagreement between Castle and his opponents did not concern the data themselves; in his argument, he used Morgan and Bridges’ own data (Morgan and Bridges 1916).
But he showed that their model led to absurd predictions. It sometimes gave distances greater than 50, which was impossible if distance was defined as the frequency of recombination. His third argument, finally, was that many ad hoc hypotheses were necessary to maintain the linearity hypothesis: double crossing-over, as well as interference, was dispensable if one renounced linearity. Indeed, Castle proposed a model where distance was always a linear function of recombination frequencies. Therefore, since recombination frequencies were not additive for great distances, the model could not be linear (unidimensional), and it had to be three-dimensional (see figure 7). His main argument in favor of his model was that it fit the data better than Morgan and Bridges’s map. He did not claim that this model was anything more than a predictive tool, but nevertheless acknowledged that, to be a good tool for prediction, the model must correspond to actual relationships, but perhaps to dynamical rather than spatial relationships, which would mean that his model did not claim to be topologically identical to the chromosome structure. Let me just draw a few conclusions from this case. First, the main, explicit issue is about how to present data (about which there is no disagreement). In other words, the main point of the debate between Castle’s and Morgan’s groups concerned the format in which the data were to be presented, in order to facilitate predictions and enable scientists to draw valid inferences. Nevertheless, my main point is to show that, even when neither opponent claimed that his model was realistic (Morgan and Castle were both instrumentalists), the debate cannot remain at a purely instrumental level. The various criticisms that were made about Sturtevant’s model show that one can adopt the chromosome theory without accepting linearity and vice versa. 
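Castle’s point that additive “distances” above 50 cannot themselves be recombination frequencies can be made vivid with a back-of-the-envelope calculation. Assuming, purely for illustration, that the number of crossovers between two loci is Poisson-distributed (this is the assumption behind Haldane’s mapping function, not part of the 1919 debate itself), the probability of an odd number of crossovers, and hence of observed recombination, approaches but never exceeds 1/2:

```python
from math import exp

def p_odd(mean_crossovers):
    """P(odd number of crossovers between two loci), assuming -- purely for
    illustration -- a Poisson-distributed crossover count. For Poisson(m),
    P(odd) = (1 - e^(-2m)) / 2, which is strictly below 1/2."""
    return (1 - exp(-2 * mean_crossovers)) / 2

for mean in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"mean crossovers {mean:>4}: observed recombination {p_odd(mean):.3f}")
# The frequency rises toward, but never reaches, 0.5: summed map "distances"
# above 50 therefore cannot be observed frequencies, only deduced quantities.
```

For small means the frequency is close to the mean itself (which is why short distances behave additively), while for large means it saturates near 50%, exactly the regime in which Morgan’s summed distances exceed any possible frequency.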
Linkage mapping, as Morgan’s group practiced it, relied on the linearity hypothesis and not on the chromosome theory. Still, I would like to suggest that the debate with Castle shows that Morgan’s method cannot be properly understood without taking into account his commitment to the chromosome theory as a mechanical model of crossing-over that relied on the assumption that genes are material parts of chromosomes. In order to show that, let me ask the following questions: What do genetic maps represent? And how do they do so? First, one would like to say that genetic maps are representations of chromosomes. They are called chromosome maps; they were invented according to Morgan’s hypothesis and adoption of the chromosome theory. The four maps for Drosophila correspond to the four chromosomes of the fly. In addition, maps enabled geneticists to gain knowledge concerning the structure of chromosomes, which was confirmed afterwards by cytological observations. Even if genetic distance was not physical distance, some topological relations remained (the respective ordering of the genes). All that became clear when the linkage map was mapped onto the microscope image. But if one focuses on the mode of acquisition of the maps, and on their syntactic properties, one cannot reasonably say that they are a visualization of chromosomes, even with some distortions. As I have said, the diagram and the table contain exactly the same information. In that sense, the maps were a purely graphical presentation of numerical data (a visualization of the data, not of the chromosomes), like a graph, that makes some information immediately or more easily available. Their function as representations and prediction tools is independent of the actual material support of genes. The experimental technique that led to their construction was, again, the technique of Mendelian breeding experiments. As Bateson’s criticism and the debate with Castle show, around the 1920s there was no observational basis for the chromosome theory or for the hypothesis of the linear ordering of genes along chromosomes.
The debate with Castle mainly concerned the best presentation of data, and none of the opponents would have argued in favor of a realistic attitude towards the model (no one would say that spatial relations shown in the map represented actual spatial relations). Nevertheless, I suggest that one should draw the analysis a bit further. Although any model that was supposed to explain linkage had to be hypothetical in character, and no one claimed that they had more than a tractable and useful model for predictions, one can observe that, in the model of Morgan’s group, the underlying mechanical explanation was not only useful but indispensable if one wanted to properly use and interpret the maps, as Castle’s “misunderstanding” (his mistaken view of distance, which made him ignore the fact that great distances were not recombination frequencies, but were deduced from recombination frequencies) showed. Indeed, the spatiality of Sturtevant’s map adds something more than what a graph adds to our cognitive abilities to solve problems and draw inferences. Certainly, distance, crossing-over, double crossingover, and interference can be defined numerically at a genetic level, but, to put it bluntly, the introduction of the notion of “distance” and the spatiality of diagrams was not purely metaphorical: the understanding of the whole model and of the construction of maps crucially depended on the underlying hypothesis concerning the material structure of chromosomes. In that sense, the spatiality of the maps had some kind of pictorial (or schematic) dimension: their spatiality played a role in understanding how they worked. The chromosome hypothesis was not only the suggestion at the origin of their invention, but it also guided the understanding of the maps. One could not properly understand the map if one did not adopt the hypothesis of the materiality of chromosomes and of the linear ordering of genes along them. 
Castle’s “misunderstanding” could only be avoided by relying on a notion of distance that was absurdly complex if it was not based on a material hypothesis. This is particularly clear when one analyzes the various ways in which the terms “crossing-over” and “distance” are used and understood. Indeed, in Morgan’s technique, distance was numerically defined not as the frequency of observed recombination, but rather as the frequency of real recombinations, which were not all observed. Therefore, there was some kind of realistic, physical aspect to the notion of distance.




Figure 7 A: Castle’s rat-trap model (Castle 1916). From: Castle 1919, p. 29.




Figure 7 B: Morgan and Bridges’ map. From: Castle 1919, p. 27 (Castle himself reproduced the diagram from Morgan and Bridges 1916, p. 23).


In conclusion, I would add that the classical picture according to which Castle is the instrumentalist and Morgan the realist in this debate does not properly fit the facts. For both, realistic concerns and computability constraints interacted, and theoretical commitments played a role in the choice of one or the other form of representation, even if their explicit concern was to find the best way to present their data. The use and construction of maps was in principle independent of the chromosomal reality, but their interpretation depended crucially on it, and thus on the way the data were presented. Even if the explicit issue was one of convenience, prediction, and so on, the underlying mechanical model played a role.

Discussion

Bruhn: I wanted to support your idea of the topology with another example that may be interesting for you, because whenever you want to discuss this problem that is inherent in diagrams, you can also use a 1930s or 1940s subway map – for example, the London tube, where you see that. They tried to standardize forms and caused trouble because it made the map more readable [than] they thought. People would connect the distances to the times of travelling because they had some topology and topography in their minds. The association is the representation of topographical relations, the distances. And I think that might be very interesting, because the production of these subway maps was done [with] the idea that from these maps, people would learn to abstract. And so they didn’t need any legends below saying, “Please keep in mind that they don’t stand for the time of travelling in minutes.” With this kind of diagram, you have a sort of three-dimensional modeling of the problem. You sometimes have a scale because you need a scale. So the diagram is not only the spatial model inside, but also the scale is included. And that is something that leads us to call it topological, because there is something of topological explanation inside. So it is not only diagrammatic, or visual; there is something more in it.

Metscher: I want to add something to Christina’s question about Castle’s diagram. It seems, as far as I understand what he was trying to represent, that it actually contains some information that was not in the plots that Morgan and Bridges had. The more it contains, the more the things are not so consistent and not so easy to explain. So, the linearity does not add up, and there must be some explanation for that. And if it is actually like a rigid-beads-on-a-string sort of model, then they could always add up – it looks [like] something that doesn’t fit so good.

Vorms: The beads-on-a-string model is an explanation for the non-additivity caused by double crossing-over.
Metscher: So what I was wondering: it is really messy and difficult to look at. This is awful [pointing to Castle's image], this is very difficult. Well, they were drawing a picture of a chromosome; this is just a plot of it. Later, people who knew about chromosomes looked back at it and said, ah, that's a chromosome. So we can discount that completely, and that answers why. We don't look back even if it contains interesting information.


What do Genetic Maps Represent (and How)?

Bibliography

Bridges, Calvin B. 1916. "Non-disjunction as proof of the chromosome theory of heredity" (part 2). Genetics 1: 107-163.
Castle, William E. 1919. "Is the arrangement of the genes in the chromosome linear?". Proceedings of the National Academy of Sciences of the United States of America 5(2): 25-32.
Creighton, Harriet B. and Barbara McClintock. 1931. "A Correlation of Cytological and Genetical Crossing-over in Zea Mays". Proceedings of the National Academy of Sciences of the United States of America 17: 492-497.
Lynch, Michael. 1988. "The externalized retina: Selection and mathematization in the visual documentation of objects in the life sciences". Human Studies 11: 201-234.
Morgan, Thomas H. 1928. The Theory of the Gene (revised and enlarged edition; 1st edition 1926). New Haven: Yale University Press.
Morgan, Thomas H. and Calvin B. Bridges. 1916. "Sex-linked inheritance in Drosophila". Carnegie Institution of Washington Publications, vol. 237.
Morgan, Thomas H., Sturtevant, Alfred H., Muller, Hermann J. and Calvin B. Bridges. 1915. The Mechanism of Mendelian Heredity. Henry Holt and Company.
Painter, Theophilus S. 1935. "The Morphology of the Third Chromosome in the Salivary Gland of Drosophila Melanogaster and a New Cytological Map of this Element". Genetics 20: 301-326.
Sturtevant, Alfred H. 1913. "The Linear Arrangement of Six Sex-Linked Factors in Drosophila". Journal of Experimental Zoology 14: 43-59.


Visualising the Invisible in Cell Biology: The New Face of the Cell from the Late 1980s Norberto Serpente

Introduction

A visual examination of textbooks on cells from different historical periods, from the start of the twentieth century to the present, allows us to discern some important features of cell biology imagery. Firstly, the overall visual landscape of cell biology has been created by two different cultures or traditions of thought and practice, the cytological on the one hand and the molecular on the other. Secondly, the balance in the deployment of images from each tradition has shifted over the years, with the cytological being superseded by the molecular, a process that has quickened since the early 1980s. This relentless increase in images belonging to the molecular culture and dealing with invisible entities in cell biology from the early 1980s is intimately related to the growth and dominance of molecular biology inside the field. Here I use molecular culture, a rather wide term, to refer to the belief that the actions of molecules would explain all biological phenomena. Molecular culture is not exclusive to the development of molecular biology. It is also at the foundation of the experimental practices found in nineteenth-century organic and physiological chemistry, their twentieth-century continuation, biochemistry, and molecular biology. Therefore, the 1980s were not the first time that molecular culture entered the field. During the 1940s through 1960s, biochemistry began to gain a strong foothold inside the discipline that at the time was known as cytology. The visual changes I am referring to in this study were promoted, although not exclusively, by the following factors. Firstly, the deployment and subsequent standardization of new techniques and practices from biochemistry and molecular biology in cell biology laboratories.
Secondly, a growing demand in the early 1980s for didactic material directed toward an increasing population of students entering biology courses in the United States and Britain, aimed specifically at teaching them the nuts and bolts of a highly operational and expanding recombinant DNA technology. The conceptualization of the new visuality emerging in cell biology in the early 1980s deserves a different approach from that offered by classical analyses of either visual representations in science or the ontological status of the entities involved. Two currents of thought bring a fresh and stimulating perspective to this imagery shift, namely semiotics and simulation theory. Semiotics will help us to characterize this visual change (why it is meaningful), and simulation theory, in particular the variant developed by the French theorist Jean Baudrillard, will give us some hints into the cultural context where these visual changes have emerged and unfolded. This paper aims to contribute to debates on the relationship between images and the knowledge production of representational practices in cell biology, an area that remains largely unexplored in the history of bioscience.





The study focuses on cell biology textbooks written in the English language and published mainly in Britain and the United States. For the different aspects of the history of molecular biology, see Judson 1996; Kay 1993; Morange 1998. Molecular culture relates to the earlier concepts of 'the molecular vision of life' (Kay 1993) and 'molecularisation' (de Chadarevian & Kamminga 1998).



The visual history of the unit of life

By examining cell biology textbooks from different periods side-by-side, a feature becomes evident that would otherwise be concealed when only examining current editions. Contemporary editions of cell biology textbooks conflate into a unified vision what has been constructed historically by two different cultures, two traditions of thought and practice that have respectively given rise to two different kinds of visual order: the cellular (first order) and the molecular (second order). This side-by-side examination not only allows us to see the existence of different visual paradigms in cell biology and the cultures involved in their production, but also to see the changing visuality of cells throughout history. Figure 1 serves to visualize all the points that will be raised in this paper. The first point states that cells have had a visual history. Over time, cells have been viewed, depicted, represented and presented in different ways, and these visions, depictions, representations and presentations have hinged on different kinds of instruments and techniques, on different internal and external cultures surrounding their production, and on conceptual differences such as that between representing and presenting (I will explain below how I distinguish between these). The image of the unicellular amoeba (Fig. 1 A) belongs to the optical tradition and represents what I call first-order visuality. These kinds of images are created with the aid of optical microscopes and commonly by using specific staining techniques. The second image (Fig. 1 B) is that of a plant cell as produced by an electron microscope using specific sample preparation techniques, which are very different from those used with the optical instrument. The electron microscope is an instrument regarded by cell biologists as extending the visual power of the optical instrument and thus producing reliable images.
Whilst I will grant the electron microscope this capacity, and hence its images as belonging to the first-order visuality, I will consider them a transition between the types of images shown on the left and on the right in Figure 1. The third image (Fig. 1 C) is representative of the second-order visuality; it is a map of protein-protein interactions occurring inside a cell, known as the 'interactome' (Giot et al. 2003). The production of these kinds of images not only entails a different set of cultural practices and instruments than those used by the cytological tradition, but also a complex process of visual translation from one kind of traces (Knorr-Cetina and Amann 1990) (Fig. 1 D) into a morphological type of image of cells like the one created by the optical tradition. Looking over these images from left to right allows us to walk through the visual history of cells, a history that involves a process where the visible cell as a referent (object) has gradually disappeared and a manufactured symbol has taken its place instead. A quick overview of the main features exhibited by the two traditions of thought behind the two orders of visuality gives us a better idea of how this process unfolded.

First order visuality, or the cytological tradition

This visual tradition uses optical devices such as the light microscope to create images of cells, resulting in a visuality that will be considered merely an extension of our visual senses (first order). The tradition was inaugurated by A. van Leeuwenhoek, who in the 1670s described having seen for the first time animalcules (bacteria) with his single-lens microscope, and, around the same period, by Robert Hooke, who in his Micrographia (Hooke 1665) coined the term cells to refer to the empty compartments he observed in petrified and burnt pieces of wood. These observations were the first steps into the microscopic realm and, as such, were far from belonging to a defined body of knowledge. It took until around 1850 before all the varied and disparate observations of this visual tradition were integrated into a kind of unifying concept by the work of Matthias Schleiden and Theodor Schwann, together with that of Rudolf Virchow and Robert Remak. The cell theory was formulated in 1839, and observations from the visual tradition were vital not only for its formulation but also for its success in establishing the canonical knowledge of cytology through the twentieth century.

The first- and the second-order visuality are two operational categories that allow me to frame my study. The terms will become clearer as the text progresses. Briefly, however, they respectively refer to images of cells obtained with the optical and the electron microscope (first-order) and to images of cells created from instruments with inscriptional outputs other than 'visual' ones (second-order). For some of the controversies on the capacity of the electron microscope to extend the visual power of the optical instrument, see the following works: Hillman 1980; Rasmussen 1993; Rasmussen 1995.



Figure 1 A: Amoeba as seen through a light microscope, x100 (http://faculty.clintoncc.suny.edu/faculty/michael.gregory/files/Bio%20102/Bio%20102%20lectures/protists/protists.htm).



Figure 1 B: Electron micrograph of a transverse section of a root tip of the bean Phaseolus vulgaris. Illustration by Neil O. Hardy (from De Duve 1984, vol. 1, p. 25). Copyright 1984 by the De Duve Trust, Francois Vischer, trustee. Reprinted by permission of Henry Holt & Co., LLC.



Figure 1 C: A protein-protein interaction map, the 'interactome'. From L. Giot et al. 2003, fig. 4. Reprinted with permission from AAAS.



Figure 1 D: Protein-protein interaction from an immunoprecipitation/Western blotting experiment. From Serpente 1996, fig. 8. Reprinted with permission from Molecular and Cellular Neurosciences, Elsevier B.V.

The study of cells, as in many other areas of knowledge in the biosciences, has undergone important changes in its academic status and its epistemic and empirical focus since the formulation of the cell theory by Matthias Schleiden and Theodor Schwann in 1839. Almost all knowledge of cells from the formulation of the cell theory in 1839 to the 1900s originated from studies belonging to different practice-clusters (integrated nowadays into what is known as biology) such as botany, zoology, histology, physiology, fertilization, development, heredity, bacteriology and pathology. A key textbook in the creation of cytology as a systematic body of knowledge constructed from the practice-clusters of heredity and development is that of Edmund Beecher Wilson (Wilson 1925). The study of cells in Wilson's textbook was by and large (as was typical of those times) anatomical and descriptive. Cells were essentially observed with a light microscope and then depicted through drawings made either directly from the microscope or from daguerreotypes. Most of the observations were made with as little intervention as possible, on the one hand to avoid compromising the cell's integrity and on the other to adhere to the canons of a regnant 'mechanical objectivity'. Two main kinds of images were featured in Wilson's textbook, based on the part of cells that was under focus: either the cytoplasm or the nucleus. While images of cell division generally came to be accepted more rapidly as true to nature, depictions of subcellular structures were more resistant to being accepted as such. From the outset of the twentieth century until the 1950s, when the electron microscope started to be used extensively, the cytoplasm remained visually elusive. It was described as amorphous, granular, fibrillar and/or globular. Its putative structures, such as the plastidules, the biophores, the bioplasts or the micellae (none of them currently considered structures), represented only haphazard attempts to partition it. As years passed, the failure of the light microscope, and of the staining techniques associated with it, to deliver a convincing imagery of the structure of the cytoplasm greatly frustrated cytologists. It also facilitated the search for a new visual technology that could satisfy their growing need to know about the material structure of the cytoplasm. The electron microscope was that new visual technology, which by the 1940s promised advances in extending the visual power of the light microscope (Rasmussen 1997). However, as countless images alongside theoretical discussions in journals and textbooks of the time suggest, not all images were easy to interpret (Hillman and Sartory 1980; see also Rasmussen 1997). Suspicions concerning the capacity of the instrument to deliver reliable images began to recede when Porter, Claude and Fullam produced 'crystal clear' images of cultured fibroblasts in 1945 (Porter et al. 1945). Placing side-by-side two images of the same kind of cultured fibroblasts, as taken by the optical and the electron microscope respectively, was key to arguing for the electron microscope's capability to extend the visual power of the light microscope.

Daston and Galison 2007. Mechanical objectivity, in the form of photography, was a key epistemic virtue that emerged at the end of the 19th century. The attainment of mechanical objectivity was an important and cherished condition that allowed cytologists to draw conclusions about living matter.
Their strategy, known as the continuity of vision, was not new; Robert Hooke had used it almost three hundred years earlier in his Micrographia to render his magnifications credible and familiar. Despite this, the attainment of visual reliability remained, for many, far from certain (Hillman and Sartory 1980; Rasmussen 1997). Not only did a complex process of sample preparation put the images at risk of being considered artifacts, but also, at the other end, once the images were produced, a long process of learning to see, along with a tacit agreement on what was being seen, had to be settled. Of particular relevance for this process was the use of visual strategies created to orient the viewer to a particular interpretation. The deployment of paired representations, a practice that emerged by the early 1950s and that would endure in slightly different forms well into the present, was critical for this process (Lynch 1990). In a paired representation, a drawing or diagram is placed alongside an electron micrograph with the aim of facilitating the interpretation of the raw image. As Lynch argues, the drawing might even take the form of a model when it adds theoretical information that normally cannot be found in the photograph (Lynch 1990). It has to be mentioned that by the time the electron microscope began to be used in cytology and cytogenetics, a substantial change occurred when a group of cytologists (Palade, Claude, and Fullam among others) who were dissatisfied with the limitations of morphological descriptions of cells began to use the electron microscope, alongside new tools such as the ultracentrifuge, aiming to associate particular sub-cellular structures with particular biochemical functions.10 They soon claimed to know the biochemical mechanisms (reactions) that enabled the observed sub-cellular organelles to perform their functions inside cells. Evidence was accumulating for protein synthesis occurring in microsomes and for cell metabolism inside mitochondria, for instance. Partly as a consequence of this, by the 1960s the discipline changed quite dramatically: new scientific journals were created, new research clusters emerged and, quite emblematically, cytology was re-branded as cell biology.11

A historical shift towards abstraction has been detected in the successive editions of Wilson's treatise and has been conceptualized as a move from presentation to representation by Jane Maienschein (1991). This move, Maienschein argues, involved for instance the substitution of plain depictions of cellular processes, such as cell division as observed under the microscope, with depictions that embodied a theoretical view (Maienschein 1991).

The main reasons for this were, firstly, the work of T. Boveri and W. S. Sutton, who showed around 1900 that individual chromosomes carried different hereditary determinants, and, secondly, the work of Thomas Hunt Morgan and co-workers, who, working with the fruit fly between 1910 and 1915, showed that the genetic segregation of Mendel's hypothetical factors corresponded to the segregation of chromosomes ('real' observable entities) during cell division. No such association of function with an observable entity could be applied to the cytoplasmic structures at this time. It was only during the 1940s through 1960s that the use of the ultracentrifuge allowed sub-cellular structures such as mitochondria to be isolated in relatively pure fractions and thus to be biochemically analyzed. Although cytoplasmic structures such as mitochondria and the Golgi complex had been described before the twentieth century, the optical microscope was unable to define their structure (they were too small, or produced different visualities depending on the sample preparation and staining method used). Even more problematic for their reliability was the fact that no cellular function could be associated with these structures, as was the case for the nucleus with its chromosomes and cell division. Plausibly, other related events had a negative impact on the development of a cytoplasmic visuality from the 1920s to the early 1940s. It is known, for instance, that German embryologists who argued for a cytoplasmic role in inheritance in the 1920s and 1930s were in open opposition to what they dubbed the 'nuclear monopoly' of the American school, represented by Morgan's Mendelian-chromosome theory of inheritance (Sapp 1987). Their lack of success in finding cytoplasmic structures comparable to the nuclear chromosomes may have contributed negatively to the development of a visuality and hence to the further development of the theory.

The continuity of vision argument is a philosophical argument for scientific realism. It holds that entities existing in the world are in a smooth continuum, that there is no divisive point between the observable and the unobservable; in other words, that there is an ontological continuity from what we see to what we cannot see. In cell biology it is based on the fact that there are overlapping areas of vision between the naked eye and a magnifier, between a magnifier and an optical microscope, and between this last and an electron microscope. The philosopher Grover Maxwell, who first developed the argument, claimed that all entities are observable under suitable circumstances by using the proper instruments, like a magnifier, an optical microscope, and so forth (Maxwell 1962).

10 Bechtel 2006. See also Rasmussen 1993; 1995; 1997 and Rheinberger 1995; 1997.

11 A couple of recent studies show the complexity of this particular re-arrangement of research clusters into emergent disciplines. Brauckmann describes the importance played by tissue culture as an emergent discipline in driving the creation of the new cell biology in Europe and the United States (Brauckmann 2006a&b) and how the technical evolution of cell culture was capitalized on by a group of cytologists, among them Porter, who actively promoted a growing molecularization of cell biology (Brauckmann 2006b).

Second-order visuality, or the molecular tradition

The cytological tradition was not the only culture of knowledge focusing on cells as objects of inquiry; biochemistry also developed a specific tradition. Before the 1940s, biochemical practices, known at the time as organic or physiological chemistry, were oriented to unraveling biochemical reactions independently from where they occurred, that is, independently from cellular structure. In fact, biochemical knowledge was produced out of broken cells (in vitro systems) (Florkin 1972). An important realization for biochemists, one that brought them closer to the cytologists' interests, was that biochemical reactions needed cellular structural organization (Fruton 1992). From the 1960s onwards, biochemistry would have a growing presence in the new discipline of cell biology and would contribute its own visuality. Before merging with the cytological tradition, biochemistry and its forerunners, organic and physiological chemistry, depicted molecules in different ways, such as paper formulas of organic molecules, space-filling 3D models of polypeptides in the late 1940s and DNA in the mid-1950s, and computer-assisted depictions of globular macromolecules in the 1980s (Klein 2003; de Chadarevian and Hopwood 2004). Despite the ever-increasing explanatory power that biochemistry started to have inside cytology, its imagery would remain separated from that of the cytological tradition in cell biology textbooks until the early 1980s. For instance, electron micrographs of mitochondria and the biochemical processes described in the text as occurring inside those sub-cellular structures, such as the Krebs cycle, did not overlap visually; they remained in cytology textbooks as literally separated images.12 During the late 1970s and early 1980s a new kind of molecularisation entered cell biology. New biochemical techniques emerged (electrophoresis, monoclonal antibodies, immunoprecipitation/Western blotting) and began to combine with those of molecular biology (genetic engineering, recombinant DNA technology) to create new experimental set-ups, soon becoming the basis for the development of new research areas such as signal transduction and gene regulation.13 This phenomenon went on to create a new kind of imagery for the molecular tradition, an imagery that would begin to compete with the imagery created by the first-order visuality.

The emergence of the second-order visuality in Molecular Biology of the Cell (MBC), a key textbook for bringing the molecularized cell to mass audiences

If there is a book that embodies the process of the visual molecularization of cells, it is Molecular Biology of the Cell (MBC) by Alberts et al., first published in 1983 and currently, as of 2008, in its fifth edition (Alberts et al. 2008). The book was an initiative of James Watson in the late 1970s. Encouraged by the success of his previous textbook on prokaryotes, Molecular Biology of the Gene (Watson 1965), Watson thought the time was ripe to present the expanding knowledge on higher organisms that had just begun to accumulate. What also encouraged Watson was market research conducted in biology departments, mainly in the United States, that spotted the need for a cell biology textbook able to introduce students to the new and blossoming techniques of molecular biology, in particular genetic engineering with its promise to intervene in life processes.14 MBC proved to be a real success not only among students, because of its pedagogic qualities, but also among professionals, because of its summarizing qualities in an era of relentlessly expanding knowledge. MBC set a new wave of molecularisation of the cell underway, and there are no better words to exemplify it than those of Alberts' textbook: "In a sense, cells are halfway between molecules and men."15 This textbook was a kind of visual exploration where, for the first time, now-familiar themes such as walking molecules, presented using cartoon-like depictions, appeared in printed form.16 MBC not only expanded the pictorial tradition of biochemistry, but also provided fertile ground for the second-order visuality to flourish. In fact, MBC was one of the first printed media where depictions of signal transduction processes and Jacob and Monod-style models of gene regulation in eukaryotes appeared.
Molecules perceived as having a globular form (with a pattern of similitude to how cells were depicted in older textbooks) were depicted as interacting in complex networks and changing shape after those interactions. These kinds of depictions of invisible molecular interactions continued to grow not only in the later editions of the Alberts et al. textbook, but also in articles and in catalogues from biotech companies selling fluorescent-linked antibodies directed against the molecules involved in the signaling process.17 All this helped expand research in cell biology in the direction of post-genomic systems biology, a complex biology based on models of protein-protein interactions, to which the interactome belongs (right image in Figure 1). The impact of MBC would be such that, after the first edition of 1983, all forthcoming cell biology textbooks more or less used the kind of imagery pioneered by it.18 An important aspect is concealed from the whole process of image production, in MBC in particular and in cell biology in general: the second-order visuality has been largely constructed from the interpretation of the traces that invisible entities (molecules) leave in a given inscriptional matrix such as a radiogram (Knorr-Cetina and Amann 1990) (Figure 1 D). What is evident from what we have assessed so far is that the cell as a referent, the one created by the visual tradition, has been eclipsed at the expense of the images created by the molecular tradition, a tradition that has instead taken as its referent the interaction of invisible molecules and the biochemical processes in which they are involved. As a result, a 'new' cellular anatomy is created (see Fig. 1 C, the interactome, with the colors green, blue and yellow representing respectively the nucleus, the cytoplasm and the cell membrane). What also remains concealed in the process is that the production of the second-order visuality entails a complex translational process, with a rich procedure of decisions and particular interpretations of how one type of inscriptions, or molecular signs, becomes meaningful in a different domain.

12 See for instance figures 11-11 and 11-17 in De Robertis & De Robertis 1987.
13 Gene regulation until that time had been studied in prokaryotes (bacteria), the Jacob and Monod operon model being its best example. I refer to the extension of that kind of study to eukaryotic cells.
14 Author's interview with Professor Martin Raff and Dr Julian Lewis, coauthors of MBC, London 2007; Roberts 2003.
15 Alberts et al. 1983, preface.
16 Alberts et al. 1989, figure 3-63, p. 131.
The following will help to substantiate my argument for a semiotic change in cell biology from the late 1980s, one that entails a change from icon-type images of cells, as observed under the optical microscope, to symbol-type images like the interactome.

Cells: from icons to symbols

Semiotics will serve to characterize the nature of the change observed in the visual history of cells (Eco 1976; Chandler 1997). The process of signification, or semiosis, is based on an interrelationship among the following three elements: first, a signifier, which is the form in which a sign appears; second, a signified, the sense that is made of it in our minds; and third, a referent, the object to which the sign refers.19 Of particular relevance is the conceptualization of signs in three different types, icons, indexes and symbols, depending on the quality of the relationship between signs and referents and the degree of convention used to associate them.20 The amoeba cell (Fig. 1 A) could be granted the category of an icon, that is, a sign whose signifier bears a degree of resemblance to the object/referent. This works only if we accept that the microscopic image, made with increased magnification, has a relationship of continuity with an observation made with the naked eye. In fact, with good light and contrast conditions, large amoebas can be perceived with the naked eye.21 The second image (Fig. 1 B) could be considered an index. Relying on the continuity of vision argument, the relationship between signifier and signified is perceived as more physical or causal (in relation to a previous image). However, more than for an icon, with an index the relation hinges more on convention than on a relation of resemblance, or better put, the relation of resemblance is given by a convention (think of the discussions that took place to justify the images obtained with the electron microscope as having visual continuity with those obtained with the optical instrument; see footnote 9). Finally, the interactome (Fig. 1 C) constitutes a symbol. The status of symbols is based on pure convention, for there is no relation of resemblance or physical connection between signifier and signified, due to the impossibility of access to the referent (molecules), and hence the impossibility of comparing the two. What is more, the idea of visual continuity, as observed for the case of the amoeba or the electron microscopic image, is also an underlying assumption at stake for images from the second-order visuality such as the interactome. Second-order visuality images are thus granted a double status: one of indexicality (the direct physical link between image and referent attributed to photographs) and one of iconicity, through the idea of visual continuity. Put differently, the interactome, a symbol, builds its authenticity as an image in part because of its capacity to embody 'traces' of indexicality and iconicity.

17 Cell Signalling Technology Inc. catalogue 2005.
18 One typical case is that of Darnell, Lodish and Baltimore 1989.
19 This is a slight modification of the semiotician Charles Sanders Peirce's (1839-1914) triangle of signification, with the elements signified, signifier and referent (object) at the vertices of the triangle (Chandler 1997).
20 Originally developed by Charles Sanders Peirce (Hoopes 1991). Convention refers to the linguistic justification necessary to explain the connection between a referent/object and its sign. For icons, due to the relation of similitude between referents and signs, the use of convention is minimal, depending on how far a sign departs in similitude from its referent. For indexes, the convention, although direct due to the physical connection between sign and referent (the sound of a train and the train itself), depends strongly on cultural convention. The degree of convention for indexes could turn out to be equal to or greater than that established for icons (depending on the culture that hears the sound of a train and associates it with the actual train, for instance). Convention becomes absolutely necessary for symbols, where the relation between referent/object and sign is purely based on it.

Representation, presentation, invisibility and translation

A further point I want to discuss in this paper concerns the inappropriateness of the concept of representation, especially for describing the second-order visuality category of images in cell biology. My argument on the limitations of the concept of representation when dealing with invisible entities builds on Lynch's plea for an open investigation into the uses of the concept (Lynch 1994). Lynch argues that science studies have shown that the term scientific representation is at times used indistinguishably to refer to pictures, drawings or photographs. The term is also used imprecisely to describe things that are either visible or invisible, riskily conflating the different origins and qualities of each.22 If we take representation (as mimesis) to describe the condition in which the referent is directly perceptible to the eyes and hence comparable to its rendering, then we cannot conceptualize the two images of cells in Figure 1 (B and C) as representations. A resemblance check between referent and rendering is only possible for the image of the amoeba (Figure 1 A), arguably possible for the electron microscope image (B) if we believe in the continuity of vision, but definitely not for the image of the interactome (C). Images of the second-order visuality, such as that of the interactome, are clearly presentations rather than representations, in the sense that they are based on acts of free creation rather than on acts of imitation or resemblance. A further reason why the term representation is inadequate when referring to images B and especially C in Figure 1 is that each of these images derives in part from different

21 Issues of observational/visual continuity are discussed in Chapter 11 ("Microscopes") of Hacking 1983. As we saw earlier, Robert Hooke (Hooke 1665) in fact made the first attempt to establish this relation of continuity (between naked-eye and microscopical observations). This relation of visual continuity is the same one that Porter, Claude and Fullam used to probe the capability of the electron microscope to produce reliable images, when they put pictures of cultured fibroblasts taken by the optical and the electron microscope side by side.
22 This is due in part to the fact that the meaning of representation currently overlaps (is used indistinguishably) with other concepts such as imitation, mimesis, and copy, all concepts with a complex history of associated meanings.


Visualising the Invisible in Cell Biology: The New Face of the Cell from the Late 1980s

processes of translation, a process of editing and transforming one type of image into another, a process that is invisible to any audience not involved in its production. Any visual depiction of a referent (visible or invisible) entails a process of translation, that is, a process in which several steps of inscription, transcription, and/or fabrication take place through a chain of decisions involving several actors, technological devices and normative settings (Pauwels 2006). The process of transformation (signification) from referent/object to final product, the visual image, grows in complexity when dealing with invisibles, be they entities such as molecules or electrons or concepts such as entropy or alienation. Consequently, more explanatory steps are required to justify their ontological status. This is clear in the three images of cells in Figure 1 (A, B and C): as we move from left to right, from icon to symbol, it is evident that the level of explanation needed to justify an image grows in complexity. One important point to retain from all this is that, despite their disparate origins, what these images have in common is that they all make an epistemic point about what cells are, a point that hinges on the well-defined historical moment in which they were produced.

Is the map preceding the territory?

A final point to ponder in this paper is whether the growth of the second-order visuality, in other words the move from representation to presentation in cell biology, has created a condition of hyperreality in which the difference between the real and the unreal no longer holds, a condition that may affect the highly praised operational capacity given to pictorial representations in science by its practitioners.23 Hyperreality refers to a situation where signifiers are freed from the necessity of referring to reality. They are produced again and again as equivalents for one another; a condition in which the distinction between what was considered real and unreal (models) becomes hazy. This is similar to the condition semiotics describes as a move from an iconic mode into a symbolic one. Jean Baudrillard describes a condition of twentieth-century Western societies, especially from the late 1970s onwards, in which the circulation and consumption of images is taken to be more important than the production, circulation and consumption of real physical commodities (Baudrillard 1981, 1993, 1999). He refers to the flurry of idealized images of lifestyles, self-appearance, cars, houses, drinks and other products that become determinants of the real for consumers. This condition, or 'logic of simulation', is so pervasive that, once born, it has remained the main organizing principle in these societies. It is, in my view, not by chance that the move from the iconic to the symbolic in cell biology resembles this wider tendency within our Western culture of visualizing objects that are not in themselves visual, a tendency characterized by a concomitant desire to replace understanding with aesthetic pleasure. After all, the visual history of cells as portrayed in Figure 1 emerged and runs in parallel with many of the developments of this hyperreal culture.
Does this universe of images in cell biology contain only a highly sought-after aesthetic value? Are these images acting as ends in themselves within cell biologists' work? Are they only visual commodities, becoming more and more distanced from the real world from which they emerged? Are these images playing only a didactic role, oriented to sustain the molecular paradigm in cell biology? Georges Canguilhem (Canguilhem 1965) expressed these questions as deep concerns. He thought that twentieth-century biology was fascinated by the prestige of the physico-chemical sciences and was consequently reducing itself to a satellite of these. Such a reduced biology, he remarked, implies as a corollary the disappearance of the biological object as such. Canguilhem, although from a different perspective than Baudrillard's, was anticipating the possible consequences of the relentless growth of molecular culture on cell biology. Before history reveals the answers to these questions, I think it is worth wondering whether images of cells belonging to the two traditions of thought, the cytological and the molecular, can coexist in the long term.24 Or whether the molecular image has created a condition where the map precedes the territory, as the writer Jorge Luis Borges described in one of his tales about an emperor who asked his cartographers to map his territorial domains.25

23 By operational I refer to the capacity of models to produce changes in the world; more specifically, how effectively the second-order visuality continues successfully to explain phenomena in cell biology and simultaneously to create new experimental conditions to change those phenomena.

24 The idea is of a condition in cell biology whereby, by placing too much confidence in the presentations of the molecular culture, key cytological knowledge with therapeutic potential, produced by the first-order visuality, is being overlooked. See Persson et al. 2001.
25 Baudrillard uses the Argentinean tale as an introduction to his Simulacres et Simulation (1981). Borges's original tale is called "On Exactitude in Science" (Borges 1999). The map/territory relationship, that is, the idea that no abstraction can substitute for the object itself, was proposed by the scientist and philosopher Alfred Korzybski (1933). Gregory Bateson expressed a similar concern about the impossibility of knowing the object referent through representations in his book Steps to an Ecology of Mind (Bateson 1972).

Discussion

Bruhn: Is it correct to say that the second-order visuality is also indicated by the fact that there are other people involved in producing the images? I mean, we have a graphic specialist, someone who has to draw and purify the images, whereas the experimenters pretend that they produced the images themselves, sitting at a machine and using marking technology, etc. And the other ones are not only highly worked through, more elaborated, but they also involve other people in their production.

Serpente: Yes, absolutely. And I think one of these changes we are witnessing – what I called the move from the iconic to the symbolic – is that, apart from involving other people in the production of images, there are technologies. It is a way of making, of recreating a new way of looking at things. But, yes, they are all participants. Now, if you go by the acknowledgments of the classical textbooks, there were maybe twenty people to acknowledge. Compare that with Alberts's book: there is a huge number of people to acknowledge, because they called people up asking, "Do you have pictures for me?" And so the production of this textbook is quite different, creating this virtual witnessing through a different kind of relationship, completely different from the one created by the old-type ones.

Brandstetter: I am just wondering about this very complicated image you showed, like the complicated images before. Did they also have a different function, like evoking a feeling of the sublime, a feeling that things are more complex and sophisticated than you might think at first instance? And so that they sort of caution us against reductionism and linear-whatever sorts of explanations?

Serpente: Yes, I am glad that you mentioned that. This idea of the sublime – sometimes they are even quite open about it, saying, "Well, I was into art before I called it my lab." Some are not; they put it in a more empirical, reductionistic way. Yes, playing with the sublime, that is, complexity that comes from the necessity to create a very easy and eye-catching picture. But, yes, in the search for simplicity we are arriving at the sublime.

Wall: These images are beautiful, and I do love the comment on the sublime, the last one when you ended, so that these are signals. What I wanted to comment on is another collaboration, which is the collaboration with the audience, and the move from the iconic to the symbolic: the nature and sophistication of the visual literacy required to read the images. That increases, so that the first image argument is, "Oh, what I see is a cell." It then fades to this network of signals, which someone with the correct literacy can instantly see as a network of patterns: "I can see the pattern, but I can't read it." The literacy required seems patterned from the beginning to the end, as does the nature of the people involved.

Serpente: There is a changing nature of the audience, too, and they need to address that with the images. One of the reasons why Molecular Biology of the Cell was born is that the authors thought the previous cell biology was boring. That was what I got from the interviews; that was what they went against. They wanted to give it another kind of public appeal, and they were successful. When I was discussing it with Costis, he said that that was the book.

Perini: I am just asking for clarification. What is at the heart of second-order visuality? Is it the fact that it is indirect, that we are visualizing things we can't see, to which we have no visual access through our perceptual system? Or does it have to do with the style and the interpretative requirements of the visual representations that are produced? So something that is more traditionally pictorial and iconic would count as first-order visualization, and something that is not would be second-order. I am having trouble teasing out which one of these two things is at the heart of second-order visualization.

Serpente: Well, I think it depends on how you take the existence of the referent. I think some biochemists really believe that you can see, that reductionism is the engine of production, and there is nothing wrong with that. I think they deeply believe that they can picture it; they see it as a representation, a mimetic approach.
They believe you can always compare, and can always relate it to the level of the cell by an image.

Perini: The resemblance could be a relationship of visual resemblance, because we can see the picture; it has visual properties. But what is being represented through the picture is not something we can ever have a visual experience of. The biochemists who analyse pictures might have some assumptions; they might talk about the pictures that way, and so they may say those kinds of things. But I guess this is a question for you and the rest of us: what are we supposed to extract out of that second-order visualization? Is the question you are asking whether this is a way of making visual representations about things that we can't see, so that we might not have the right epistemic connections between the referents and the output representation? The biochemist is sort of taking that connection for granted, but we shouldn't? Is this it?

Serpente: No, I don't want to give any kind of comment like we should or we shouldn't. I think science is all about that: making assumptions, special assumptions about something you can't say or can't see. Some people take it as problematic, relating those images where the referent is not there, as if that contradicted the image as science. I myself don't; I think that is fine. We are representing, we are presenting, and that's all what science is about.

Caianiello: I think you are at a real watershed; there is more to this kind of new realism of simulation. Of course, there is a new order of realism. I don't know whether there will be a time when one can do the same to a cell as Brian is doing with an embryo, to see a different level.

Serpente: There are some moves toward that now, movies…

Papanayotou: From a scientist's point of view, I think what is important with these images is not whether an enzyme, for example, is presented as a circle or a rectangle, or whether it is blue, or green, or white. What is important is the information that there is this enzyme and that enzyme, and that they do that job. And it is an easy way of seeing it and understanding it.


Serpente: But that job isn't there. I mean, I won't see that job; I see who interacts with whom, I see probably this band in a gel and so on, and that kind of thing, but I don't see what they are doing with each other.

Kerfeld: I just want to add, this is really iconic of modern biology, where we have a high output of proteins, where we have databases, and that's how you are looking at the data, and also reflecting on them.

Thieffry: I think the way we look at the data here is more like a view to show that it is very complex. When we really look at it, it is more through inquiries on the computer. This is not conveying any specific information; it is conveying a very global gist, telling you, "Yes, there is a high output on the screen, a lot of components, a lot of interactions characterizing it." That's it. And maybe there is some more information in the coloration, but that is just an illustration of that stage. When you really want to dig into the data, then you have to use the computer.

Burian: Very important: the new range of interpretation is about the three-dimensional interaction of chromosomes as a default, in which things have been seen but sometimes not understood at all. Now these things come together, meet, and do things on a particular occasion. And the representation of this is an entirely different sort of issue. And that is also in the images you have shown, which are now reproduced in some textbooks.

Serpente: In such texts, some cartoons are repeated in different places, so the same molecule is represented by the same cartoon on page 10 and on page 304.


References

Alberts, Bruce, Alexander Johnson, Peter Walter, Julian Lewis, Martin Raff and Keith Roberts. 2008. Molecular Biology of the Cell. New York/London: Garland Publishing.
Alberts, Bruce, Dennis Bray, Julian Lewis, Martin Raff, Keith Roberts and James D. Watson. 1983. Molecular Biology of the Cell. New York/London: Garland Publishing.
Alberts, Bruce, Dennis Bray, Julian Lewis, Martin Raff, Keith Roberts and James D. Watson. 1989. Molecular Biology of the Cell. New York/London: Garland Publishing.
Bateson, Gregory. 2000 [1972]. Steps to an Ecology of Mind. Chicago: University of Chicago Press.
Baudrillard, Jean. 1981. Simulacres et Simulation. Paris: Galilée.
Baudrillard, Jean. 1993. Symbolic Exchange and Death. London: Sage Publications.
Baudrillard, Jean. 1999. L'Échange Impossible. Paris: Galilée.
Bechtel, William. 2006. Discovering Cell Mechanisms: The Creation of Modern Cell Biology. Cambridge: Cambridge University Press.
Borges, Jorge Luis. 1999. Collected Fictions. Middlesex: Penguin Books.
Brauckmann, Sabine. 2006a. "A Network of Tissue Culture and Cells: The American Society for Cell Biology". Archives Internationales d'Histoire des Sciences 56 (156-157): 295-308.
Brauckmann, Sabine. 2006b. "Networks of Tissue Knowledge". Bulletin d'Histoire et d'Épistémologie des Sciences de la Vie 13 (1): 33-52.
Canguilhem, Georges. 1965. La Connaissance de la Vie. Paris: Vrin.
Chandler, Daniel. 1997. Semiotics: The Basics. London: Routledge.
Darnell, James E., Harvey E. Lodish and David Baltimore. 1989. Molecular Cell Biology. New York: Scientific American.
Daston, Lorraine and Peter Galison. 2007. Objectivity. New York: Zone Books.
de Chadarevian, Soraya and Harmke Kamminga (Eds.). 1998. Molecularizing Biology and Medicine: New Practices and Alliances, 1910s-1970s. Amsterdam: Harwood Academic Publishers.
de Chadarevian, Soraya and Nick Hopwood. 2004. Models: The Third Dimension of Science. Stanford: Stanford University Press.
De Duve, Christian. 1984. A Guided Tour of the Living Cell, Vol. I. New York: Scientific American Library and Rockefeller University Press.
De Robertis, Eduardo D. P. and Eduardo M. F. De Robertis Jr. 1987. Cell and Molecular Biology. 8th edition. Philadelphia: Lea & Febiger.
Eco, Umberto. 1976. A Theory of Semiotics. Bloomington: Indiana University Press.
Florkin, Marcel. 1972. "A History of Biochemistry". In: Florkin, Marcel and Elmer Henry Stotz (Eds.), Comprehensive Biochemistry, Vol. 30. Amsterdam: Elsevier.


Fruton, Joseph S. 1992. A Skeptical Biochemist. Cambridge, MA/London: Harvard University Press.
Giot, L., et al. 2003. "A Protein Interaction Map of Drosophila melanogaster". Science 302: 1727-1736.
Hacking, Ian. 1983. Representing and Intervening: Introductory Topics in the Philosophy of Natural Science. Cambridge: Cambridge University Press.
Harris, Henry. 1999. The Birth of the Cell. New Haven/London: Yale University Press.
Hillman, Harold and Peter Sartory. 1980. The Living Cell: A Re-examination of its Fine Structure. Chichester: Packard Publishing Limited.
Hooke, Robert. 1665. Micrographia: or, Some Physiological Descriptions of Minute Bodies Made by Magnifying Glasses. With Observations and Inquiries Thereupon. London: J. Martyn and J. Allestry.
Hoopes, J. (Ed.). 1991. Peirce on Signs: Writings on Semiotics by Charles Sanders Peirce. Chapel Hill: UNC Press.
Hughes, Arthur. 1959. A History of Cytology. New York/London: Abelard-Schuman.
Judson, Horace Freeland. 1996. The Eighth Day of Creation: Makers of the Revolution in Biology. Cold Spring Harbor: CSHL Press.
Kay, Lily E. 1993. The Molecular Vision of Life: Caltech, the Rockefeller Foundation, and the Rise of the New Biology. New York: Oxford University Press.
Keller, Evelyn F. 2002. Making Sense of Life: Explaining Biological Development with Models, Metaphors, and Machines. Cambridge: Harvard University Press.
Klein, Ursula. 2002. Experiments, Models, Paper Tools: Cultures of Organic Chemistry in the Nineteenth Century. Stanford: Stanford University Press.
Knorr-Cetina, Karin D. and Klaus Amann. 1990. "Image Dissection in Natural Scientific Inquiry". Science, Technology & Human Values 15 (3): 259-283.
Korzybski, Alfred. 1933 [1995]. Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics. San Francisco: Institute of General Semantics.
Lynch, Michael and Steve Woolgar (Eds.). 1990. Representation in Scientific Practice. Cambridge, MA: The MIT Press.
Lynch, Michael. 1990. "The Externalised Retina: Selection and Mathematisation in the Visual Documentation of Objects in the Life Sciences". In: Lynch, Michael and Steve Woolgar (Eds.), Representation in Scientific Practice. Cambridge, MA: The MIT Press.
Lynch, Michael. 1994. "Representation is Overrated: Some Critical Remarks about the Use of the Concept of Representation in Science Studies". Configurations 2 (1): 137-149.
Maienschein, Jane. 1991. "From Presentation to Representation in E. B. Wilson's The Cell". Biology and Philosophy 6: 227-254.


Maxwell, Grover. 1962. "The Ontological Status of Theoretical Entities". In: Feigl, H. and Grover Maxwell (Eds.), Scientific Explanation, Space and Time. Minnesota Studies in the Philosophy of Science, Vol. 3. Minneapolis: University of Minnesota Press.
Morange, Michel. 1998. A History of Molecular Biology. Cambridge, MA/London: Harvard University Press.
Morrison, Margaret. 1999. "Models as Autonomous Agents". In: Morgan, Mary S. and Margaret Morrison (Eds.), Models as Mediators: Perspectives on Natural and Social Sciences. Cambridge/New York: Cambridge University Press.
Palade, George E. 1952. "A Study of Fixation for Electron Microscopy". Journal of Experimental Medicine 95: 285-298.
Pauwels, Luc. 2006. "A Theoretical Framework for Assessing Visual Representational Practices in Knowledge Building and Science Communications". In: Pauwels, Luc (Ed.), Visual Cultures of Science: Rethinking Representational Practices in Knowledge Building and Science Communication. Hanover/London: Dartmouth College Press, University Press of New England.
Persson, Carl G. A., Jonas S. Erjefält, Lena Uller, Morgan Andersson and Lennart Greiff. 2001. "Unbalanced Research". Trends in Pharmacological Sciences 22 (10): 538-541.
Porter, Keith R., Albert Claude and Ernest F. Fullam. 1945. "A Study of Tissue Culture Cells by Electron Microscopy: Methods and Preliminary Observations". Journal of Experimental Medicine 81 (3): 233-246.
Rasmussen, Nicolas. 1993. "Facts, Artifacts and Mesosomes: Practising Epistemology with the Electron Microscope". Studies in History and Philosophy of Science 24 (2): 227-265.
Rasmussen, Nicolas. 1995. "Mitochondrial Structure and the Practice of Cell Biology in the 1950s". Journal of the History of Biology 28: 381-429.
Rasmussen, Nicolas. 1997. Picture Control: The Electron Microscope and the Transformation of Biology in America, 1940-1960. Stanford: Stanford University Press.
Rheinberger, Hans-Jörg. 1995. "From Microsomes to Ribosomes: 'Strategies' of 'Representation'". Journal of the History of Biology 28: 49-89.
Rheinberger, Hans-Jörg. 1997. Toward a History of Epistemic Things: Synthesizing Proteins in the Test Tube. Stanford: Stanford University Press.
Roberts, Keith. 2003. "On Drawing Molecules". In: Inglis, John R., Joseph Sambrook and Jan A. Witkowski (Eds.), Inspiring Science: Jim Watson and the Age of DNA. New York: CSHL Press.
Sapp, Jan. 1987. Beyond the Gene: Cytoplasmic Inheritance and the Struggle for Authority in Genetics. Oxford: Oxford University Press.
Serpente, Norberto, Cristiana Marcozzi, Gareth E. Roberts, Bao Qi, Brigitt D. Angst, Elizabeth M. A. Hirst, Ian D. J. Burdett, Roger S. Buxton and Anthony I. Magee. 1996. "The Regulation of the Expression, Phosphorylation and Protein Associations of pp125FAK during Rat Brain Development". Molecular and Cellular Neuroscience 7: 391-403.
Sismondo, Sergio. 1999. "Models, Simulations, and their Objects". Science in Context 12 (2): 247-260.


Watson, James D. 1965. Molecular Biology of the Gene. New York/Amsterdam: W. A. Benjamin.
Wilson, Edmund B. 1925. The Cell in Development and Heredity. 3rd edition. New York: The Macmillan Company.


III Rhetorica
Visual Pedagogy

Science Joins the Arts: The Collection of Watercolors and Drawings of Marine Organisms at the Stazione Zoologica Anton Dohrn
Christiane Groeben

Introduction

Man has always used words to convey knowledge about artifacts, events, processes, and living beings over distances of space and time. Philostratus's eikones are a fascinating and striking example of this. The second-century Greek sophist and orator described to a ten-year-old boy 64 paintings he had seen in a villa in Naples. These paintings live on and have survived only through his words. It had been a challenge for him to recreate a work of art through the art of words. Today we have many tools and means to show what we are talking about; we no longer have to rely only on our capacity to recreate images through words. In science, images and words should never be in competition – they offer different value at different times, but in the end they should always be complementary. Think of J. Z. Young in this context. In 1929 he extended his research on the epistellar body to cephalopods. He did not find an epistellar body, because cephalopods don't have one (Young 1985, p. 155). His thoughts, however, returned to yellow spots he had seen on the slide, and at a certain point he realized that they were not veins, as he had first thought, but giant nerve fibers. It was a visual aid that led to this landmark discovery in neurobiology. And have you ever tried to follow carefully Alexander Kowalewski's first description of the chorda dorsalis he discovered in ascidians in 1866? Recently I had to translate it from German into English for a Japanese guest investigator – rather an impossible task had there not been the figure illustrating his description. One without the other would have left far too much room for misunderstandings. The techniques and options for documenting one's findings have changed very much since then; the ethics of choices (should) have not.
In this paper I want to investigate why the Zoological Station was interested in helping its patrons to experience the remarkable interaction between text and image and how this was put into practice.

The collection of scientific illustrations

The Archives of the Zoological Station hold a collection of more than 1,200 scientific drawings in pencil, ink, and watercolor made between 1885 and 1930. Almost all of the drawings are signed, and the organisms are sometimes identified by the authors. Most are also dated, either by the artist or by an entrance stamp of the Stazione Zoologica. There seems to be no obvious explanation or unifying concept for this collection. I found some of the drawings in folders without any apparent order; others were kept in envelopes grouped by species, others still in albums arranged by subject. And still more are cut out and pasted on cardboard, apparently ready for publication. The drawings had somehow survived at the institute before being passed on to the Historical Archives in 1969. Shortly before that date, they had passed through the hands of Ugo Moncharmont, long-time collaborator and expert in marine biology, and in the Gulf of Naples in particular. To him we owe the identification of many illustrations; he also updated the nomenclature and added information on eventual publication. About ten years ago, the collection became the topic of a post-doctoral thesis with the aim of creating a database. The database has recently been transferred to the software we now use for the Archives, and a project is under way to allow access to the images as well as to the records. But the question remains: is this just a beautiful treasure we happen to have, or is there a particular reason for its existence in this special context? The artists number fewer than ten, while over the same period more than 2,500 guest investigators used the facilities of the Naples Station.

The Stazione Zoologica as publisher: The Fauna and Flora

By 1878, i.e., five years after the opening of the Naples Station, about 100 scientists had already worked there. The research went smoothly, as did the technical and scientific administration. Dohrn could therefore take another step towards the goal and main purpose of the Naples Station, namely to better organize scientific research in zoology (Dohrn 1881, pp. 511-512). In addition to the hospitality offered to guest investigators, Dohrn started a series of publications as a service for the benefit of the broader scientific community. In only three years Dohrn launched three major serials: the much-appreciated in-house journal for staff and guests, Mittheilungen aus der Zoologischen Station Neapel; the reference journal Zoologischer Jahresbericht; and, in 1880, the series of monographs Fauna und Flora of the Gulf of Naples. Carl Chun's study on the ctenophores of the Gulf of Naples was the first in a series of forty volumes published between 1880 and 1982 (for a complete list see Groeben 1975, pp. 64-68). It is on the Fauna and Flora series that I shall concentrate here. Dohrn's objectives were as follows:
1. To create a basic work for marine biology
2. To bring some order to the terribly confused nomenclature
3. To identify the species through illustration and description [my emphasis] (Dohrn 1885, p. 133)
Each taxonomic unit was to be treated in all its aspects (morphological, developmental, systematic, ecological) and had to include all references to previous studies.

Experts

Dohrn knew that such comprehensive studies needed experts, funding, and efficient coordination. He carefully selected the experts from among the younger guest investigators and offered them contracts as assistants. This guaranteed the monographer financial security and permanent access to research material. For younger scientists such a monograph was a safe passport to tenure. Of the first twenty-one monographs, six were authored by temporary staff members, four by Dohrn and his three permanent assistants, and eleven by short-term guest investigators.



Dohrn has often been described as a gifted manager of science. Before starting a new activity he carefully weighed all pros and cons. Do ut des was his motto for whatever he did, i.e., you can only obtain what you want if you offer something equally important to your partner. This had worked well for the table, or bench, system: you paid for a service (or a bench) and could expect it to be good. The same principle was applied to the Fauna and Flora project.


An agreement, signed by both parties, defined the duration of the collaboration, financial support, and working conditions; and three jurors (Schiedsrichter) were always nominated, e.g., Theodor Boveri, Eugen Korschelt and Arnold Lang for Julius Wilhelmi’s monograph on Triclades (Wilhelmi 1909). Later, artistic assistance was also guaranteed.

Funding

Dohrn had it all figured out from the start: he knew that such an enterprise would put an enormous strain on the Stazione's budget, but he also knew that perfection has its price. He therefore decided to sell the monographs on a subscription basis – subscribers committed themselves for a three-year (later five-year) period. Paying 50 Marks annually, they received at least two monographs per year. The first year brought two volumes with a market value of 100 Marks, the second year two volumes with a market value of 72 Marks, the third year five volumes with a bookseller's price of 171 Marks. The first volume was ready by the end of June 1880, and the plates even earlier. Together with a collection of preserved animals, they were the real attraction at the Fisheries exhibit in Berlin in April 1880. By January 1881 Dohrn already had 140 subscriptions; with 250-300, he said, he would be on the safe side. In the second year the number went up to 233, including 14 members of royalty and a certain Charles Darwin, Esq., Kent. Fourteen years and twenty monographs later, the total number of subscriptions had gone down to 199, which made Dohrn reduce the number of printed copies from 530 to 415, and again to 315 shortly before World War I.

Management Production was complex and expensive. The texts were printed in Germany, at first by Engelmann in Leipzig; but when that publisher decided to continue printing Haeckel’s works as well, Dohrn switched to Friedländer in Berlin. The plates were produced by Werner and Winter in Frankfurt, the top German establishment for lithographs. Much effort went into keeping the Fauna and Flora business going through intense correspondence with the publisher, the printer, and the authors. Very often Dohrn had to adapt his publication schedule to new, mostly human, at times also tragic, circumstances. The unexpected death of Giuseppe Jatta prevented the writing of the second volume of his cephalopod monograph (volume 23, 1896), which was redone much later by Adolf Naef (volume 35, 1928). Some authors made their favorite organisms a lifelong study without ever producing the monograph, such as Stephan von Apathy and his Hirudinea. Career priorities also prevented the completion of some commitments: Johann Wilhelm Spengel, for instance, had committed himself to producing three monographs (Balanoglossus, Sipuncoliden, Gephyreen), but in the end only one was published. There is evidence of more than 25 announced but never finished monographs. Dohrn knew from experience how important it was to accompany any text with good illustrations, especially in such a relatively new field as marine biology, in which new species were being discovered all the time and detailed descriptions of the organisms and of many developmental stages were much needed. True-to-nature illustrations offered precious tools to the scientist, but they also had a secondary function, namely to turn the publications into handsome “coffee-table” books for the noble and nouveau-riche potential subscribers, especially in Imperial Germany.



The fact that Dohrn commissioned the work and paid for it gave him the right to adjust publication schedules within reason and to curb excesses in the texts as well as in the illustrations. He complained strongly that in many cases a costly color print was presented where a simple, differentiated pencil sketch would have sufficed (Dohrn 1885, p. 132).
Dohrn to MD 20.10.1880, Berlin, Archives of the Stazione Zoologica (ASZN): Bd. 172.

Christiane Groeben



Figure 1: Terebelloids from the Gulf of Naples and the nearby Sea, Plate I. Drawings by Comingio Merculiano, legends by E. Meyer, 1889. Archivio Storico Stazione Zoologica Anton Dohrn: T.8./17.10.

The artists And this is where our collection of scientific illustrations comes in again. Many scientists of Dohrn’s period were also passable artists – necessarily so, since images could be communicated only through drawings – but few reached high artistic levels, and for many it was not really a labor of love. Otto Bürger, monographer of volume 22 (1895) on Nemertina, for example, insisted that all his figures be published because they were all relevant and, he argued, “drawing does not give me such satisfaction as to produce more than what is necessary.” Scientific illustrations not only require skill; they also demand concentration and time, which Dohrn no longer had. He therefore hired Franz Etzold (1859-1929) to help him with the illustrations for a monograph on selachians he was planning to write (which never came about). Etzold, a student of Rudolf Leuckart, specialized in the representation of macro- and microscopic objects. He did some minor work for Dohrn, but left after ten months – for lack of work, he said. He took artistic memories with him, however, as can be seen in the cephalopod plate he published a couple of years later in the Brockhaus encyclopedia (14th ed., 1893-95). By that time Dohrn had already solved the essential problem of qualified scientific illustration by hiring the Neapolitan artist Comingio Merculiano (1845-1915). We do not know much about his biography, his working style, or his work as an artist. Born in Naples, he had received his training at the Institute of Fine Arts. His collaboration with the Naples Station seems to have begun sporadically: the first guide to the Public Aquarium was published in four languages in 1880 with no figures at all; three years later, an “Atlas” was printed that contained 47 plates with 167 marine animals, in black and white, all done by Merculiano.
In 1885 Dohrn hired him permanently, and for almost 30 years Merculiano contributed his skill and talent to making the Fauna and Flora monographs remarkable works not only for their results but also for the beauty of their plates. In particular he contributed to the monographs on Amphipods (Della Valle, volume 20, 1893), Nemertines (Bürger, volume 22, 1895), and Cephalopods (Jatta, volume 23, 1896). Thirteen years later another Neapolitan artist, Vincenzo Serino (1876-1945), joined the staff. Among his teachers at the Institute of Fine Arts were important names such as Morelli, Palizzi, and Vetri. He continued a family tradition: his father Alfonso was himself an artist and had a printing business in Naples. Some of young Serino’s works (which I have not yet located) were exhibited in Italy and abroad. In particular he collaborated with the authors of the monographs on Mytiliden (List, volume 27, 1902), Phoronis (Selys-Longchamps, volume 30, 1907), and the Triclades (Wilhelmi, volume 32, 1909). After Anton Dohrn’s death in 1909, his son Reinhard seems to have taken up several new projects for other monographs, most of them unpublished, probably because of the war. He also hired 25-year-old Francesco Manzoni to assist Merculiano, whose eyesight was failing. Manzoni did some lovely watercolors, supposedly for a special project, but not for any monograph. He left the Station in 1913 because, Dohrn said, there was no need for two artists. Hiring the artists as staff members and having their services permanently available was certainly a wise decision. They worked with the living animals in the lab or on the boat during excursions, using a microscope and a magnifier. They worked for guest investigators and also for monographers who were no longer in Naples. In co-illustrated monographs, the color plates are almost exclusively by Merculiano or Serino. There is unfortunately no evidence about the way they interacted with their patrons.

Bürger to Dohrn, 17.2.1894, ASZN: A, 1894, B. From the third edition (1890) on, the figures were inserted in the main text and have remained there since. However, their collaboration was not limited to the Fauna and Flora monographs. It included papers in

The artistic heritage Our collection of scientific drawings consists of slightly more than 1,200 items by nine artists. Serino leads with 671 illustrations, followed by Merculiano with 477 and Francesco Manzoni with just 29 watercolors. There is evidence that the artists’ output was treated as a kind of database: the big albums, first used by Merculiano, were later turned upside down and used by Serino; drawings were removed and/or replaced; there are cardboard sheets with series of developmental stages, some of which have been cut out. We also know that collections were given away to potential authors or for publication, e.g., to the Italian Encyclopedia Treccani (1913[?]). Series of drawings of ascidians by Merculiano were given to Charles Julin (volume 29, 1904); Max Rauther asked in 1925 whether he was supposed to return the plates with the Merculiano and Serino originals published in his monograph on Syngnathidae; Otto Bürger apologized for having cut out drawings by Max Heinze that he was supposed to return to Naples. The Russian zoologist Eduard Meyer returned a collection of nine plates on Terebellidae to Naples, and recently seven anatomical drawings by Comingio Merculiano were donated to the archives by Nikolas Waser from Riverside, California. How they came into his possession is still a mystery. This limited initial information leads to the conclusion that the unpublished drawings and watercolors in our collection are only the tip of an iceberg – the leftovers of the total production of the professional artists at the Naples Station. I have now become curious and shall certainly dedicate more research to locating single illustrations and collections originated by “our” artists, and to integrating our mostly unpublished collection with information on published illustrations and the whereabouts of the originals. I would also very much like to know how the artists felt about their work: Was it just a duty they had to perform?
Did they consider the illustrations works of art, products of their individual creativity? Was it difficult for them to let them go? Were they involved in the publication process? They certainly were artists, with the talent and skill to transfer the physical, finite physiognomy of a living organism into an eternal, reproducible visual aid.

Conclusions The Fauna and Flora project set a landmark in the institutional policy of the Zoological Station. Dohrn saw the need, envisioned the solution, and had the ability and means to implement it. He offered scientists the opportunity to accomplish a major scientific work, and he rendered a service to science through this diffusion of knowledge. The artists were part of the service offered to the authors. They belonged to the Stazione’s invisible technicians, somewhere in the production line between the author – or the fishermen – and the printed result. In conclusion I would like to mention Ilona Richter, another great artist and illustrator, who dedicated an important part of her work to the last two Fauna and Flora monographs: no. 39 by Anita Brinkmann on Anthomedusae (1970) and no. 40 by Schmekel and Portmann on the Opisthobranchs. Unfortunately for our collections, she felt strongly about her work and took the originals with her to Hungary. This was also the end of the project – not so much, I suspect, because we lack artists as because there is no longer a way to be granted the time necessary to write another monograph in the Fauna and Flora style.

the Mittheilungen, the Aquarium guides, books, and other items such as the results of Krupp’s deep-sea campaigns around Capri and a 1902 publicity poster.

Bibliography
Dohrn, A. 1881. “Bericht über die Zoologische Station während der Jahre 1879 und 1880.” Mittheilungen aus der Zoologischen Station zu Neapel 2: 495-514.
Dohrn, A. 1885. “Bericht über die Zoologische Station während der Jahre 1882-1884.” Mittheilungen aus der Zoologischen Station zu Neapel 6: 93-148.
Groeben, C. (ed.). 1975. The Naples Zoological Station at the Time of Anton Dohrn. In collaboration with Irmgard Müller. Naples: [Stazione Zoologica di Napoli].
Wilhelmi, J. 1909. Tricladen (Fauna und Flora, 32). Berlin: Friedländer.
Young, J. Z. 1985. “Cephalopods and Neuroscience.” Biological Bulletin 168 (supplement): 153-158.

Visualizing Sexual Differentiation: The Rhetoric of Diagrams Shelley Wall

This brief paper considers the implications of the rhetoric of visual diagrams used for patient education, and specifically for education about disorders of sex development (DSD). The context for these reflections is my work as a medical illustrator and my experience in creating a Web-based clinical counseling tool about DSD (also referred to as intersex conditions) for The Hospital for Sick Children in Toronto, Canada. There has been much discussion and debate about the words used to designate DSD (Hughes et al. 2006). These conversations have addressed both the descriptive accuracy of competing terms and those terms’ social, psychological, political, and emotional connotations: their ability to stigmatise or to empower. Like verbal language, visual language – even didactic, ‘factual’ visual language – carries latent as well as manifest content, and can shape social attitudes and self-perceptions. Visual images are widely used in patient education – and in medical education – to enhance spoken or written explanations. Visual paradigms, like terminology, can therefore play a role in influencing attitudes toward a given medical condition. In the case of illustrations about disorders of sex development, where the need for non-stigmatising communication is crucial, it is especially important to consider the implicit messages conveyed by the imagery and compositional strategies used in educational material. For example, the common genre of the documentary clinical photograph – the naked, vulnerable patient, standing in the ‘anatomical position’, eyes hidden behind a black bar – has been critiqued as both ethically questionable and symbolically harmful (for example, Dreger 2000; Preves 2003). Not only were such photographs, historically, often taken or disseminated without the patient’s consent, but the image’s compositional structure itself positions the intersex person as a powerless, dehumanised object incapable of returning the gaze.
The aesthetic model referenced by this kind of composition is the catalogue specimen, with units of measurement and standard positioning for ease of comparison. On the other side of the ledger, some illustrations employed in DSD patient education and advocacy explicitly model strength, social integration, and well-being by representing clothed, smiling people (see, for example, Warne 1997). The aesthetic models referenced by such images include portraiture, fine art, street photography, or superhero comics. They portray intersex people as active agents in their own lives, interacting with others or meeting the viewer’s gaze. The primary emotional valence of images like these is relatively easy to decipher, since these are pictures of people, clearly unhappy or content, clearly abject or empowered. Less obviously, schematic diagrams, too, operate on an affective as well as a cognitive level, encoding specific interpretations of, and attitudes toward, the information depicted. The rhetoric of diagrams is the primary topic I wish to broach in this paper. The use of paired or multiple repeating images representing ‘normal’ and ‘abnormal’ morphology is a standard convention in medical illustration: for example, a diagram of a ‘normal’ heart might be shown beside those of hearts exhibiting patent ductus arteriosus and atrial septal defect, or a diagram of a ‘normal’ knee-joint next to one of a knee marked by arthritis. When we see two or more similar images side-by-side, we can perceive patterns and begin to construct

www.aboutkidshealth.ca/HowTheBodyWorks/Sexual-Differentiation.aspx?articleID=6850&categoryID=XS-nh3 (accessed April 30, 2009).

classificatory systems. For instance, to understand an unusual anatomy we must first understand what it is a variation from, and within that gap the difference becomes visible. Separately, each image represents a body with a particular anatomy; together, they measure out the differential between ‘normal’ and ‘abnormal’. How, I wondered, might this very common visual syntax play out in illustrations about disorders of sex development? Given that a visible deviation from ‘normal’ genital anatomy is frequently the first clue to the existence of a DSD, and given that atypical genital anatomy is often viewed as pathological, I suspected that comparative illustrations of this kind might be used in educational material for patients, parents, and even medical practitioners. A survey of educational illustrations about DSD available on the Web, in peer-reviewed journals, and in standard embryology texts reveals, in fact, an adaptation of the normal/abnormal, paired-image strategy. When it comes to genital anatomy, of course, there are two ‘normals’. Just how these ‘normal’ anatomies are defined is a contentious issue in itself; but what is accepted as given, in the visual syntax of sex development, is that female ‘normal’ and male ‘normal’ are diametrically opposed. They form the extreme and idealised binary pair that becomes the dual reference point for all other anatomies. A common result, when it comes to visual representations of sex development, is a tripartite diagrammatic structure with female and male anatomy in opposition and the common anlage represented at the apex of a triangle that reads from top to bottom, or at the origin of a branching timeline that scans from left to right. Variations on this structure are used to describe genital, gonadal, and internal-organ development. Now, this might seem like the obvious way to represent differentiation, but it comes with an underlying message. 
The compositional movement is centrifugal, tending toward the greatest possible polarity. What lies in between is invisible; and what lies in between is the body of a person with DSD. Intersex children, adults, their parents, and caregivers can see that body as falling into this blank no-man’s-land (pun intended), and thus the implication of pathology is reinforced. In some cases, an intersex condition is represented as a literally inter-mediate term, an abnormal between normals. In creating an illustrated, Web-based, clinical counseling tool about DSD (Wall et al. 2008), I was able to use digital animation to experiment with alternative depictions of sexual differentiation, and to represent sexual anatomy as a continuum of difference, rather than as strictly dimorphic. For example, an interactive ‘slider bar’ can allow viewers to control animations of genital embryogenesis and of degrees of virilisation in XX babies, and thus to see a range of anatomical variation in addition to typical female and male phenotypes (Fig. 1).



Figure 1: Screen-captures of an interactive Prader scale used to describe degrees of virilization in 46,XX individuals; copyright The Hospital for Sick Children, Toronto, Canada.



Figure 2: Screen-captures of a prototype interactive visualization of biosynthetic pathways; copyright Shelley Wall.

Like depictions of genital development, diagrams of the physiological cascades that regulate human embryonic sex differentiation often take the form of flow-charts depicting two parallel tracks, female and male; at branch points representing instances of gene expression or hormone secretion, the developing embryo shunts onto one phenotypic track or the other. Such diagrams may be used in student textbooks, for example, to explain ‘normal’ sex differentiation, or in patient education to explain disorders of sex development. But there is a trade-off between a static flow-chart’s clarity and legibility, and its capacity to depict the full complexity of genetic and hormonal interactions. Moreover, a two-track flow-chart cannot represent the range of variation possible in sex development, except to show DSD as ‘derailments’ of the normal process.

Once again, Web-based animation offers the opportunity to create non-linear, interactive visualizations that can change dynamically to depict different scenarios: the presence or absence of a given receptor, for example, or the effect of a specific enzyme deficiency in a biosynthetic pathway. Such dynamic visualizations are not only potentially more information-rich than static diagrams; they also reconceptualize the process of sex differentiation as a network of variation rather than a strictly binary process. For example, the layout for a prototype animation on which I am currently working is based on the metaphor of a field of possibilities rather than of binary tracks (Fig. 2). Beginning at the perimeter of the diagram, the user chooses among alternative possibilities (such as a fully functional androgen receptor (AR), a partially functioning AR, or the absence of AR) arranged in concentric rings. The phenotypic effects of whatever pathway the user constructs are displayed in animations within the circular navigation. Whether the outcome is typical anatomy – such as testes in combination with typical male external genitals – or less common anatomy – such as testes in combination with typical female external genitals – the anatomical image appears in the centre of the non-linear field of possibilities. I propose that we take seriously the implications carried by the illustrations we use to represent DSD, and that we begin to develop alternative visual strategies founded on the principle of visual continuity between typical and variant states. Simple, descriptive diagrams are not generally perceived as ideological; the categories and distinctions they present can therefore seem ‘transparent’, and so the notion that there is some pathological shortfall or failing in a person whose karyotype, endocrine function, or anatomy does not fit the diagrammatic male/female binary is subtly reinforced.
Particularly in the case of external anatomy, this polarizing graphic tendency could deepen the sense of a person with DSD that their atypical anatomy is not only unacceptable but unrepresentable – creating needless additional stress. My modest proposal is to rethink the visual representation of DSD in a way that reinforces the notion of typical female and typical male anatomy as stages in a more inclusive continuum, rather than the only possible termini of development. ‘Normal’ by any definition is a range of variation; all people, even those requiring medical treatment, should be enabled to see themselves symbolically represented within that category.

Discussion [background voice]: This is really exciting. I studied the history of intersexual representations all the way back to the Middle Ages, and now the contemporary issues of that, and I have not seen anything that takes such an elegant solution in multimedia to presenting that. Otherwise it is always that kind of parallel binary positioning, and this is really amazing, turning that into something totally dynamic. I am just wondering about your background a little bit more, in terms of being able to see that kind of dynamic relationship of possibilities. In your primary work did you do just still images, or how did you come to using Flash models, which is quite an elegant way to use that tool for this system? Wall: When I got into medical illustration early on, I thought I was going to use red chalk and hand-made paper and just draw things like da Vinci drew. But that was five hundred years ago, and we don’t do that any more. Instead of very loving hand-drawing, there are – I think it might be different in Europe, but certainly in Canada – so many opportunities for multimedia patient education, and people did realize that a lot of work has to be done to educate patients about DSD. My background was in English literature and gender theory, so I came to this representation of sexual anatomy from that perspective. And because of the possibility of technology to show networks of differences rather than fixed representations, multimedia seemed to be a natural choice, although coding is not something that comes to me naturally.

Bibliography
Dreger, Alice D. 2006. “Intersex and human rights: The long view.” In Sharon E. Sytsma (ed.), Ethics and Intersex. Dordrecht: Springer, pp. 73-86.
Hughes, Ieuan A., Christopher P. Houk, S. Faisal Ahmed, Peter A. Lee, and LWPES1/ESPE2 Consensus Group. 2006. “Consensus statement on management of intersex disorders.” Archives of Disease in Childhood 91(7): 554-563.
Preves, Sharon E. 2003. Intersex and Identity. Chapel Hill: Longleaf Services.
Wall, S., M. J. Wiley, B. J. Neilson, J. Jenkinson, G. Tait, and D. J. Bägli. 2008. “Designing a web-based clinical counseling tool about disorders of sex development.” Journal of Biocommunication 34(1): E12-E17.
Warne, Garry L. 1997. Complete Androgen Insensitivity Syndrome. Victoria: Department of Endocrinology and Diabetes, Royal Children’s Hospital.

Authors and Editors

Thomas Brandstetter, Dr., Institute for Philosophy, University of Vienna, Universitätsstr. 7, A-1010 Vienna, Austria
Christina Brandt, Dr., Max Planck Institute for the History of Science, Boltzmannstraße 22, 14195 Berlin, Germany
Sabine Brauckmann, Dr., Science Centre, Tartu University Library, Struve 1, 50091 Tartu, Estonia
Matthias Bruhn, Dr., Das Technische Bild, Hermann von Helmholtz-Zentrum für Kulturtechnik, Humboldt Universität zu Berlin, Unter den Linden 6, 10099 Berlin, Germany
Richard Burian, PhD, Professor of Philosophy and STS, Philosophy Department, 220 Major Williams Hall, Virginia Tech, Blacksburg, VA 24061, USA
Silvia Caianiello, ISPF CNR, via Porta di Massa 1, 80133 Napoli, Italy
Soraya de Chadarevian, PhD, Professor, UCLA Center for Society and Genetics, Box 957221, 1323 Rolfe Hall, Los Angeles, CA 90095-7221, USA
Ariane Dröscher, Dr., Dipartimento di Biologia evoluzionistica sperimentale (Dep. of Evolutionary and Experimental Biology), Università degli Studi di Bologna, via Selmi 3, 40126 Bologna, Italy
Erna Fiorentini, Dr. Dr., Lecturer, Kunsthistorisches Institut, Freie Universität Berlin, Koserstr. 20, 14195 Berlin, Germany
Maura C. Flannery, PhD, Director, Center for Teaching and Learning, St. John’s University, 8000 Utopia Parkway, Queens, New York 11439, USA
Christiane Groeben, Dr., Stazione Zoologica Anton Dohrn, Villa Comunale 1, I-80121 Napoli, Italy
Marianne Klemun, PhD, Professor, Institute for History, University of Vienna, Dr.-Karl-Lueger-Ring 1, A-1010 Vienna, Austria
Brian D. Metscher, PhD, Assistant Professor, Department of Theoretical Biology, University of Vienna, Althanstr. 14, 1090 Vienna, Austria
Mait Metspalu, PhD, Research Fellow, Estonian BioCentre, Riia 23n, 51010 Tartu, Estonia
Gerd B. Müller, Professor Dr., Department of Theoretical Biology, University of Vienna, Althanstr. 14, 1090 Vienna, Austria
Costis Papanayotou, PhD, Research Department of Cell & Developmental Biology, Division of Biosciences, University College London, Gower Street (Anatomy Building), London WC1E 6BT
Silver Rattasepp, BA, Institute of Semiotics, University of Tartu, Riia 78, 50410 Tartu, Estonia
Stephane Schmitt, Dr., Equipe REHSEIS, Université Paris 7, REHSEIS UMR 7219, Laboratoire de Philosophie et d’Histoire des Sciences, Case 7064, 5 rue Thomas Mann, 75205 Paris CEDEX 13
Norberto Serpente, PhD, Science Centre, Tartu University Library, Struve 1, 50091 Tartu, Estonia
Sema K. Sgaier, Dr., Bill & Melinda Gates Foundation, A-10, Sanskrit Bhawan, Qutab Institutional Area, Aruna Asaf Ali Marg, New Delhi, 110067, India

James Sharpe, PhD, ICREA Research Professor, EMBL-CRG Systems Biology Unit, Centre for Genomic Regulation (CRG), Dr. Aiguader 88, 08003 Barcelona, Spain
Denis Thieffry, PhD, Professeur, TAGC-INSERM ERM206, Université de la Méditerranée, Campus Scientifique de Luminy, Case 928, 13288 Marseille Cedex 09, France
Marion Vorms, MA, IHPST (Université Paris 1), 13 rue du Four, 75006 Paris, France
Shelley Wall, PhD, Lecturer, Biomedical Communication, Institute of Medical Science, University of Toronto, 1 King’s College Circle, Toronto, ON M5S 1A8, Canada

Max-Planck-Institut für Wissenschaftsgeschichte

Max Planck Institute for the History of Science

Preprints since 2008 (a full list can be found at our website)

340 Uljana Feest, Giora Hon, Hans-Jörg Rheinberger, Jutta Schickore, Friedrich Steinle (eds.): Generating Experimental Knowledge
341 Sílvio R. Dahmen: Boltzmann and the art of flying
342 Gerhard Herrgott: Wanderer-Fantasien. Franz Liszt und die Figuren des Begehrens
343 Conference: A Cultural History of Heredity IV: Heredity in the Century of the Gene
344 Karine Chemla: Canon and commentary in ancient China: An outlook based on mathematical sources
345 Omar W. Nasim: Observations, Descriptions and Drawings of Nebulae: A Sketch.
346 Julia Kursell (ed.): Sounds of Science – Schall im Labor (1800–1930)
347 Sophia Vackimes: The Genetically Engineered Body: A Cinematic Context
348 Luigi Guerrini: The ‘Accademia dei Lincei’ and the New World.
349 Jens Høyrup: Über den italienischen Hintergrund der Rechenmeister-Mathematik
350 Christian Joas, Christoph Lehner, and Jürgen Renn (eds.): HQ-1: Conference on the History of Quantum Physics (Vols. I & II)
351 José M. Pacheco: Does more abstraction imply better understanding? (“Apuntes de Mecánica Social”, by Antonio Portuondo)
352 José Miguel Pacheco Castelao, F. Javier Pérez-Fernández, Carlos O. Suárez Alemán: Following the steps of Spanish Mathematical Analysis: From Cauchy to Weierstrass between 1880 and 1914
353 José Miguel Pacheco Castelao, F. Javier Pérez-Fernández, Carlos O. Suárez Alemán: Infinitesimals in Spain: Antonio Portuondo’s Ensayo sobre el Infinito
354 Albert Presas i Puig: Reflections on a peripheral Paperclip Project: A technological innovation system in Spain based on the transfer of German technology
355 Albert Presas i Puig: The Contribution of the History of Science and Social Studies to the Understanding of Scientific Dynamics: the Case of the Spanish Nuclear Energy Program
356 Viola Balz, Alexander v. Schwerin, Heiko Stoff, Bettina Wahrig (eds.): Precarious Matters / Prekäre Stoffe. The History of Dangerous and Endangered Substances in the 19th and 20th Centuries
357 Florentina Badalanova Geller: Qur’ān in vernacular. Folk Islam in the Balkans
358 Renate Wahsner & Horst-Heino v. Borzeszkowski: Die Naturwissenschaft und der philosophische Begriff des Geistes
359 Jens Høyrup: Baroque Mind-set and New Science. A Dialectic of Seventeenth-Century High Culture
360 Dieter Fick & Horst Kant: Walther Bothe’s contributions to the particle-wave dualism of light
361 Albert Presas i Puig (ed.): Who is Making Science? Scientists as Makers of Technical-Scientific Structures and Administrators of Science Policy
362 Christof Windgätter: Zu den Akten – Verlags- und Wissenschaftsstrategien der Wiener Psychoanalyse (1919–1938)
363 Jean Paul Gaudillière and Volker Hess (eds.): Ways of Regulating: Therapeutic Agents between Plants, Shops and Consulting Rooms

364 Angelo Baracca, Leopoldo Nuti, Jürgen Renn, Reiner Braun, Matteo Gerlini, Marilena Gala, and Albert Presas i Puig (eds.): Nuclear Proliferation: History and Present Problems
365 Viola van Beek: „Man lasse doch diese Dinge selber einmal sprechen“ – Experimentierkästen, Experimentalanleitungen und Erzählungen um 1900
366 Julia Kursell (Hrsg.): Physiologie des Klaviers. Vorträge und Konzerte zur Wissenschaftsgeschichte der Musik
367 Hubert Laitko: Strategen, Organisatoren, Kritiker, Dissidenten – Verhaltensmuster prominenter Naturwissenschaftler der DDR in den 50er und 60er Jahren des 20. Jahrhunderts
368 Renate Wahsner & Horst-Heino v. Borzeszkowski: Naturwissenschaft und Weltbild
369 Dieter Hoffmann, Hole Rößler, Gerald Reuther: „Lachkabinett“ und „großes Fest“ der Physiker. Walter Grotrians „physikalischer Einakter“ zu Max Plancks 80. Geburtstag.
370 Shaul Katzir: From academic physics to invention and industry: the course of Hermann Aron’s (1845–1913) career
371 Larrie D. Ferreiro: The Aristotelian Heritage in Early Naval Architecture, from the Venetian Arsenal to the French Navy, 1500–1700
372 Christof Windgätter: Ansichtssachen. Zur Typographie- und Farbpolitik des Internationalen Psychoanalytischen Verlages (1919–1938)
373 Martin Thiering: Linguistic Categorization of Topological Spatial Relations. (TOPOI – Towards a Historical Epistemology of Space)
374 Uljana Feest, Hans-Jörg Rheinberger, Günter Abel (eds.): Epistemic Objects
375 Ludmila Hyman: Vygotsky on Scientific Observation
376 Anna Holterhoff: Naturwissenschaft versus Religion? Zum Verhältnis von Theologie und Kosmologie im 18. Jahrhundert (TOPOI – Towards a Historical Epistemology of Space)
377 Fabian Krämer: The Persistent Image of an Unusual Centaur. A Biography of Aldrovandi’s Two-Legged Centaur Woodcut
378 José M. Pacheco: The mathematician Norberto Cuesta Dutari recovered from oblivion
379 Tania Munz: “My Goose Child Martina”. The Multiple Uses of Geese in Konrad Lorenz’s Animal Behavior Studies, 1935–1988