
Article

How scientific research instruments change: A century of Nobel Prize physics instrumentation

Social Science Information 1–27
© The Author(s) 2017
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0539018417709099
journals.sagepub.com/home/ssi

Anne Marcovich

CNRS University of Paris IV, France

Terry Shinn

CNRS University of Paris IV, France

Abstract
This article explores structures of intellectual, operational and institutional transformations of scientific research instrumentation in the 20th century. The study of 26 Nobel Prizes in physics instrumentation between 1901 and the present yields abundant and systematic information related to change and stability of instrument function, instrument trajectory and the social organization of instrument-related work. This yields three configurations: ‘bounded’, ‘extensionist’ and ‘linked’. One can observe for the late 20th century the emergence of an unprecedented form of instrument-related cognitive operation that we dub ‘instrument knowledge’.

Keywords
epistemology, instrument function, instrument knowledge, instrument trajectory, instrument work organization, Nobel Prize, physics instruments

Résumé
L’article explore les structures des transformations intellectuelles, opérationnelles et institutionnelles de l’instrumentation de recherche scientifique au 20ème siècle. L’étude de 26 prix Nobel en instrumentation physique entre 1901 et aujourd’hui apporte des informations abondantes et systématiques eu égard au changement et à la stabilité de fonction de l’instrument, à la trajectoire de l’instrument et à l’organisation sociale du travail lié aux instruments. Ceci entraîne trois configurations : ‘délimitée’, ‘extensionniste’ et ‘liée’. On peut observer vers la fin du 20ème siècle l’émergence d’une forme inédite d’opération cognitive reliée à l’instrument que nous qualifions de ‘connaissance-instrument’.

Mots clés
épistémologie, fonction instrument, connaissance instrument, trajectoire instrument, organisation du travail instrument, prix Nobel, instruments de physique

Corresponding author:
Terry Shinn, CNRS University of Paris IV, 20 rue Berbier du Mez, Paris 75013, France.
Email: [email protected]

In this article we mobilize the Nobel Prizes awarded in physics between 1901 and 2012 to explore transformations in scientific-research instrumentation. It will be argued that, during the first half of the 20th century, instruments tended to serve the requirements of experimentation and theory. In many instances, however, this situation shifted in later decades. It will be shown that, since the mid-20th century, instrumentation has become a scientific knowledge form per se. The promise and power of this turn is at least in part expressed by the capacity of ‘the new instrumentation’ to create novel species of materials and phenomena that subsequently invite experimental and theoretical explorations. In this turn, one observes what we refer to as the emergence of instrument knowledge (Marcovich & Shinn, 2013; von Hippel, 2005). Here the voice of the instrument speaks more loudly than in scenarios of theory or experimentation. This transformation is in part connected to the centrality of the production of materials and phenomena in contemporary physics. Of course, human intelligence remains paramount to the operation of instrument knowledge, as the massaging of data continues to be necessary. However, in instrument knowledge, the palpable product generated by the instrument exercises the authority of knowledge. This represents a key epistemological reorientation in science.

There exists a plethora of historical and sociological literature dealing with the development of experiment and theory in physics (Barkan, 1999; Jungnickel & McCormmach, 1986; Kragh, 2002; Kuhn, 1978; Nye, 1976; Olesko, 1991; Wheaton, 1983). By contrast, one can cite relatively little work in the history and sociology of research instrumentation (Baird, 2004; Bromberg, 1991; Butrica, 1996; Choi & Mody, 2009; Elzen, 1986; Galison, 1997; Heilbron & Seidel, 1989; Joerges & Shinn, 2001; Mody, 2011; Shinn, 1993, 2008).
In this article, we adopt a perspective in which instrumentation is not perceived only as an intermediary step toward an exogenous experimental or theoretical end, but is sometimes instead seen as an object of research for its own sake. We will follow transformations of instrumentation in physics across the 20th century. The quantity of instruments developed over this time span is huge. We have chosen to restrict our study to the 26 research instruments in physics designated by the Nobel committee1 from the creation of the Nobel Prize system in 1901 to today (Crawford, 1987, 2002) (cf. Appendix 1).2 Our century-long approach to the historical sociology of instrumentation has permitted the identification of key dimensions of research instrumentation through which specific instrument patterns emerge and change.


The first part of the article introduces our descriptive vocabulary and resulting typology, which allows an appreciation of the structure and dynamics of research instrumentation. Three general parameters underpin developments in physics research instrumentation: instrument function, instrument trajectory and the organizational work environment of instruments. The parameter ‘function’ involves three types: detection, metrology and control. For its part, the parameter ‘instrument trajectory’ also concerns three types: static, branching and massification. Finally, the parameter ‘organizational work environment’ includes the types: autonomous, circumscribed and bureaucratic.3 From this, three dominant instrument configurations emerge: (1) bounded (autonomous, static, metrology); (2) extensionist (bureaucratic, massification, detection); and (3) linked (circumscribed, branching, control). These configurations prove highly revealing for an appreciation of the structures and dynamics of continuity and change in the invention, construction and operation of scientific instrumentation.

In the second part of the article, we mobilize these vocabularies for the analysis of three specific Nobel Prizes for physics instrumentation. The phase contrast microscope of Frits Zernike (Nobel Prize 1953) documents the bounded configuration (Van Berkel et al., 1999). The evolution of the cloud chamber (Charles Wilson, Nobel Prize 1927) into the Donald Glaser bubble chamber (Nobel Prize 1960), and finally into the Centre Européen de la Recherche Nucléaire (CERN) Big European Bubble Chamber (BEBC), illustrates the extensionist configuration. The story of photon trapping (Serge Haroche and David Wineland, Nobel Prize 2012) casts light on the linked configuration, which constitutes the dominant instrumentation configuration of the Nobel physics awards of our day, and which best characterizes the new epistemology of instrument knowledge.
The third part focuses on questions of transformation and continuity in the history of research instrumentation. One observes the relative rise and fall of alternative instrumentation configurations in different historical periods. The conclusion will evoke the idea of ‘instrument knowledge’ as a specific epistemological form.

Conceptual framework: Taxonomies and structures of research instrumentation

The organizational environments of instrument conception, design and construction

Examining the birth, early development and operation of instrumentation, one finds three principal types of instrument work–organization environment (Whitley, 1980): autonomous, circumscribed and bureaucratic. These types are closely related to instrument size, duration of project, degree of necessary collaboration and budget. Of foremost interest, as will be shown below, one observes the preponderance of alternative forms of instrument environment at different historical moments. However, this profile is not entirely linear or without exceptions.

The autonomous instrument-related work environment was relatively common during the early decades of the 20th century (cf. Michelson’s 1907 Nobel Prize for the interferometer; the Zernike phase contrast microscope, developed in the 1930s and awarded in 1953; and the Ruska electron microscope, invented in the 1930s and given the prize in 1986 – cf.
Eric Lettkemann, this issue). The autonomous type of instrument has occurred since then, namely with the Scanning Tunneling Microscope (STM), invented in 1981 and awarded the Nobel Prize in 1986 (see Appendix 2).

The autonomous instrumentation environment often entails engagement by only a single person, or by a tiny restricted group. Indeed, though the technology and physics may be difficult and complex, the instrument is nevertheless such that it can be managed by one or just a few persons. Operation of the device does not entail a complicated work environment requiring numerous specialists and adjacent human or material inputs. The scientist and the instrument operate largely independently of exogenous management, long forward planning, big budgets, etc. ‘Autonomous’ thus pertains to relative modesty of human and material resources. In a word, the intellectual and social work of autonomous instrumentation operates at the ‘human level’, where the intervention of big groups and bureaucracy proves unnecessary. In the case of autonomous instrumentation, it is safe to say that the researchers live the instrument: ‘they are the instrument’. They adjust, employ – and sometimes design and construct – their instrument in view of their specific experiment and questions; and as instrument logic and problems change, so do the experiments, and vice versa. The distance between the experimenter and their instrument is minimal. It is a marriage.

The circumscribed type of instrumentation organizational environment occurs on a moderately larger scale. Examples of such instruments are the Lawrence cyclotron (Nobel Prize 1939), Charles Townes et al.’s oscillators and amplifiers based on the maser-laser principle, and Serge Haroche’s photon-trapping device (Nobel Prize 2012).
Note that the achievement of instrumentation associated with the circumscribed organizational environment often resides in the control of phenomena, as opposed to the metrology function characteristic of much autonomous instrumentation, and likewise to the detection function commonly associated with many of the devices located in the bureaucratic organizational environment (see below). The instruments generated within the circumscribed environment type entail many more actors than those conceived in an autonomous framework. Design, construction and operation of circumscribed devices involve a number of scientists in collaboration with doctoral students and post-docs, engineers and technicians. Projects are carried out inside a defined and often close-knit circle of colleagues. Circumscribed work rhymes with the dynamics of teamwork. This circumscribed environment involves a team with internally differentiated competencies. All is not condensed into a single person or a small group, as with the autonomous environment of instrument organization. Hence the planning, scheduling and execution of instrumentation projects in this circumscribed world exceed the operations associated with autonomous instrumentation and its framework; yet they are nevertheless not so extensive as to demand the differentiated, high-level planning and managerial structures synonymous with bureaucratic giant instrumentation such as particle detectors (to be discussed below).

When compared with autonomous devices, the budgets of circumscribed-environment instruments are much larger, sometimes running into millions of euros. The equipment present in this category of instrumentation is frequently multiform, complex and necessarily integrated. Much design and integration work occurs inside the laboratory proper. Research projects extend over a few years, and sometimes more. This contrasts with the often decades-long, multifaceted research programs associated with bureaucracy.


In the case of instruments related to bureaucracy, their large size – for example, the huge particle accelerator conceived, constructed and operated by CERN, and the ALMA astronomical observatory in Chile – requires extended forward planning, huge resources, a vast administration and a mammoth material infrastructure. The operation of the instrument involves many divisions of labor and the integration of tasks. The environment commonly mobilizes hundreds or thousands of people for the work of construction, maintenance and operation, and it may associate the efforts of hundreds of scientists (sometimes scattered around the globe, and who may never meet face to face!) (Simoulin, 2012). In the bureaucratic instrumentation environment, national or international programs are set up in a long-term temporality, as already mentioned, extending over decades. The contrast between the bureaucratic instrumentation environment and the autonomous instrument environment is total with respect to almost everything – number of actors, temporality, economic resources, forward planning, management and, last but not least, involvement of national and international objectives. As indicated above, the apparatus of the latter is local and structured around small or moderate teams.

A note of caution: it would be an error to perceive the circumscribed environment type as intermediate between the autonomous and bureaucratic environments. Each is indeed a world unto itself.

Instrumentation trajectory – three types

A proper understanding of novelty and stability in contemporary research instrumentation necessitates a grasp of the forms of instrument trajectory as a key element in technical and cognitive development. Trajectory implies considerations of temporality. Temporality constitutes a framework for one measure of instrument stability or transformation. Some instruments remain stable across time despite everything, while others prove quick to adapt to new technological or cognitive horizons. Instrument-development dynamics are very frequently motivated by the introduction of new scientific questions, which are often addressed through the birth of devices that are extensions of extant apparatus. Here something new emerges, grounded on a tested, orthodox instrument. We identify three types of instrument trajectory: static, branching and massification.

In the static trajectory, research instruments are designed and built to address a well-defined task. Static instruments exhibit three characteristics. First, a static instrument evolves very little with the passage of time. Second, it continues to be used over a long period. Third, by and large, the purview of the instrument’s application remains the same. It is conceived and constructed for a particular question domain, and it continues to be used mainly in this same domain. It is impervious to time. It can be seen as an expression of conservatism. This definition of static instrumentation would suggest that most scientific instruments fit here. We have nevertheless detected only a few of them in our study of Nobel Prize physics apparatus.

By contrast with static instruments, what we term branching research devices can be defined as part of a chain of instrumentation/question/answer/new technology/new question concerning a family of related phenomena. Branching instruments are best seen as obeying a logic of cycling.
This entails the following four steps: (1) What can be considered a foundational referent instrument addresses a research question in a well-defined area – for example, the Michelson interferometer for the study of light. (2) Instrument-generated
research results enable the conception and development of new instruments. (3) These new instruments in turn inspire a chain of additional questions. (4) Finally, these questions stimulate the design and construction of newer devices. Branching instrumentation is synonymous with affiliations. Branching instruments can be seen as an expanding spiral of new questions, new solutions and new devices in associated domains. The true significance of the branching instrumentation trajectory lies in its strong connection with the development of novel research fields. To repeat, this cascade of new questions and new related instruments is the concretization of the germination of new fields that certainly lies at the heart of much contemporary scientific research (Bromberg, 1991). By better understanding these branching chains of questions and instrumentation, we decipher one of the key structures of today’s scientific learning. Branching instruments are strongly represented among Nobel Prize laureates of the second half of the 20th century,4 and it is through their study that the importance of branching apparatus in the birth of new fields powerfully emerges.

A complete appreciation of instrument trajectory requires inclusion of a third and final type, which we term massification. Massification refers to an important increase in the size of existing devices, where this leads to the discovery of new phenomena and the introduction of new fields (Butrica, 1996; Farge, 2012; Galison, 1997; Heilbron & Seidel, 1989; Krige, 1996; Pestre, 1992; Simoulin, 2012, 2016). These instruments are not only bigger in size, they are also more complex. This has important consequences for the category of organizational environment. While the logic of branching is instrument diversity, the underlying principle of massification is, by contrast, continuity along the same line.

Instrument functions

Finally, and perhaps most importantly, what do instruments actually do? Of course they contribute to theory as explanation of the natural world, and they provide the data of experimentation (Bachelard, 1938; Hacking, 1983; Hoddeson et al., 1992; Katzir et al., 2010; Mitchell, 2012; Olesko, 1991). The question remains, though: How can one best describe the activities proper to instruments? In the Nobel physics prize instrumentation population, different instrument functions emerge more powerfully at particular moments of history, and the association between instrument function, trajectory and environment likewise transforms over the long hundred years that we studied. Nobel Prize physics instruments reveal three major types of function: detection, metrology and control. Note that what might erroneously be considered a fourth instrument function, namely simulation, does not constitute a differentiated instrument function, but instead operates transversely across the three above-signaled functions. Of utmost interest is that today simulation is the most-cited instrument in the Science Citation Index (cf. the article by Johannes Lenhard in this issue, and Humphreys, 2004; Lenhard et al., 2007; Shinn, 2007; Winsberg, 2010).

Detection refers to the identification of a presence – be it a particle, a force or a phenomenon. In the annals of Nobel Prize physics instruments, this type of instrument function deals, for example, with particle disintegration and cosmic rays. By contrast, the metrology function includes matters of distance, space, form, weight, temperature, intensity … Dimensions are sometimes expressed as visual representations – graphs, tables
and also images. Images provide information about position, shape and the relation between positions, and they offer a simultaneous grasp of the parts and the whole (Marcovich & Shinn, 2014: Ch. 4; Shinn, 1987). Images can present a great amount of information and at the same time integrate that information (de Chadarevian & Hopwood, 2004; Hess, 1966; Morgan & Morrison, 2008). This instrumentation function often favors a deterministic epistemology.

Finally, what of the type of function termed control? Generally speaking, control can be understood as having an objective and the means to supervise the processes necessary to fulfill that objective. It is the intention and capacity to obtain an anticipated outcome. In physics, as in engineering, control is achieved through an instrument. Our understanding of control includes the capacity to produce new phenomena, which has been emphasized by Hacking in his book Representing and Intervening: Introductory topics in the philosophy of natural science (Hacking, 1983). The capacity to generate new phenomena is well exemplified by epitaxy, which is the tailoring of artificial materials (Marcovich & Shinn, 2014), and by lasers and their power to create events that do not exist in nature. Note for example the Nobel Prizes in physics of Charles H. Townes et al. for their ‘fundamental work in the field of quantum electronics, which has led to the construction of oscillators and amplifiers based on the maser-laser principle’ (Nobel Prize 1964), of Nicolaas Bloembergen and Arthur L. Schawlow ‘for their contribution to the development of laser spectroscopy’ (Nobel Prize 1981), and, finally, the 1989 physics Nobel Prize awarded to Hans G. Dehmelt and Wolfgang Paul ‘for the development of the ion trap technique’. In some areas of fundamental physics research, control has become increasingly central, where it operates as a key ingredient in the analysis of certain objects and forces.
In the course of the 20th century, it has increasingly emerged as a primary instrumentation function, somewhat distinct from detection and metrology. Interestingly, among Nobel Prizes, control has become an end in itself, while the objects of control have grown progressively elusive. In the following sections, we present three instrument configurations established through alternative weavings of instrument environment, trajectory and function: (1) the bounded configuration (autonomous, static, metrology); (2) the extensionist configuration (bureaucracy, massification, detection); (3) the linked configuration (circumscribed, branching, control) (see Table 1).

The worlds of Nobel Prize physics instrumentation: Continuity and change

The Nobel Prize foundation was established in 1901, in accordance with the will of Alfred Nobel (1833–96), in order to reward outstanding work in the areas of physics, chemistry, physiology or medicine, and literature. The Nobel Prize system provides for the annual attribution of the prize in each of these areas. Laureates are nominated by a body of specialists, and a final decision is voted by the Royal Swedish Academy of Sciences, the Karolinska Institute and the Swedish Academy (Crawford, 1987, 2002). Since 1901, there have been a total of 194 laureates in physics – 26 of them specifically for instrumentation (see Appendix 1). In the following sections, we will mobilize Nobel Prizes to study three configurations in which organizational environment, instrument trajectory and instrument function powerfully express trends in scientific instrument evolution.

Table 1.  Configurations and their types.

Bounded       Extension       Linked
Autonomous    Bureaucratic    Circumscribed
Static        Massification   Branching
Metrology     Detection       Control

The bounded configuration (autonomous, static, metrology): The case of Zernike’s phase contrast microscope

The pre-Second World War period can be equated with the development of bounded instrumentation. In this configuration of devices, the design, conception and utilization of the instrument frequently revolve around the efforts of a sole scientist or an extremely small group. The space occupied by bounded devices ranges from a tabletop to a modest room, and instrument operation is frequently carried out by one person. For this reason, the work environment is largely autonomous.

The instrument trajectory of the bounded configuration is static. These devices are static in the sense that they do not evolve directly from predecessors, and they do not spawn or integrate subsequent devices. In a word, they have ancestors but do not give rise to subsequent branching.5 However, as will be seen below, in the realm of instrumentation, the notion of static does not signify obsolescence.

Finally, with reference to instrument function, bounded devices often perform metrological tasks (as opposed to detection or control). As already indicated in the first part of the article, metrology is the science of weights and measures, positions and shapes. In many contemporary scientific research areas, metrologies render information in the form of images.

The bounded instrument configuration is illustrated here by Frits Zernike’s microscope. Frits Zernike (1888–1966) was a Dutch physicist and winner of the Nobel Prize for physics in 1953 for his invention of the phase contrast microscope, an instrument that permits the study of internal biological cell structure without the need to stain, and thus kill, the cells. He developed this apparatus in 1935.


Zernike studied chemistry (his major), mathematics and physics at the University of Amsterdam. In 1912 he was awarded a prize for his work on opalescence in gases. The following year, he became Jacobus Cornelius Kapteyn’s assistant at the Groningen University astronomical laboratory. In 1914 he was responsible, jointly with Leonard Salomon Ornstein, for deriving the Ornstein–Zernike equation in critical-point theory. He then moved on to a position in theoretical physics at the same university, and in 1920 he was promoted to full professor of theoretical physics. In a word, Zernike exhibited a remarkable degree of intellectual mobility, which can be interpreted as a marker of autonomy.

The mix of Zernike’s intellectual interests contributed directly to his instrument innovation in optics. His research here revolved around the mathematical treatment of optical aberrations. The representation of aberrations was originally based on the theory developed by Ludwig Seidel in the middle of the 19th century. Seidel’s representation was based on power-series expansions and did not allow a clear separation between the various types and orders of aberrations. Zernike’s orthogonal circle polynomials provided a solution to the long-standing problem of the optimum ‘balancing’ of the various aberrations of an optical instrument. Since the 1960s, Zernike’s circle polynomials have been widely used in optical design, optical metrology and image analysis.6

In the following, we briefly describe Zernike’s phase contrast microscope. The device’s achievement is to make the invisible visible. Traditional microscopes improve the visibility of the illuminated object under study. But if this object is transparent, there is no contrast with its environment and it remains invisible. The power of Zernike’s device resides in its capacity to increase contrast between the different illuminated regions, thus allowing the previously transparent, invisible region to become apparent.

This instrument achievement was not developed in the framework of microscopy apparatus research. In his Nobel Prize lecture, Zernike says: ‘“Phase contrast” was not discovered while working with a microscope, but in a different part of optics. It started from my interest in diffraction gratings …’ (Zernike, 1953). Much of Zernike’s specifically metrological work revolves around the creation and construction of his phase contrast microscope. Indeed, microscopy may be viewed as paradigmatic of metrology in the sense that it allows observation of the shape, position and size of entities. Zernike’s research work on the device constituted the heart of his activities from about 1930 to 1950.7

In 1930, Zernike was conducting research into spectral lines when he discovered that the so-called ghost lines that occur to the left and right of each primary line in spectra created by means of a diffraction grating have their phase shifted by 90 degrees from that of the primary line. Here the ghost lines become the center of attention and the key to physical intelligibility. From this observation, he developed the famous microscope, which allowed him to see what had previously not been visible. This was achieved not through magnification but through enhancement of the luminosity of zones that formerly existed only as background.

As evoked above, the main contribution of the phase contrast microscope is that it allows the observation of transparent objects, which makes it of fundamental interest to biology. Microbiological objects are transparent. Previously, it was thus necessary to stain them and so to work on dead systems. With the phase contrast metrological method, it is possible to study living organisms – cells’ organelles and membranes, their shape, dimensions and position. Indeed, as far back as 1933, Zernike introduced his phase contrast technique in microscopy at a Physical and Medical Congress.


Figure 1.  Phase contrast microscope.
Source: https://www.microscopeworld.com/c-426-phase-contrast-microscopes.aspx

Zernike’s work fits into the research environment that we term ‘autonomous’. First of all, consider the size of the instrument itself. Zernike’s device can fit comfortably on the top of a desk (see Figure 1). It is operated by a single person and requires no auxiliary staff. This kind of instrument does not demand a sizeable budget. What could be more autonomous? This contrasts significantly with the manpower, space and money necessary in the case of the linked configuration of Haroche’s instrument (see below). As appreciated through the Nobel system lens, it is important to note that the bounded configuration is confined to the first half of the 20th century.

The trajectory of the phase contrast microscope is static. It has not spawned new species of devices. Georges Nomarski’s differential interference contrast microscope (1952) and Robert Hoffman’s modulation contrast microscope (1975) constitute updates and improvements to the Zernike phase contrast microscope, but they cannot be counted as branching instruments. Nevertheless, today’s Zernike microscope is coupled to a computer, whose screen reproduces the magnified target object. It is to be noted that the phase contrast microscope continues to be widely used, particularly in biology and medicine. The most recent publications directly concerned with Zernike’s phase contrast microscope date from June 2016.

The extensionist configuration (massification, detection, bureaucracy): The case of particle detectors

Extensionist research instrumentation is relatively new in the history of science. Among the first extensionist instruments, one can point to the 1930s construction of the famous Mount Palomar telescope (Florence, 1994), to the Bellevue giant electromagnet (Shinn,

Marcovich and Shinn


1993) and to the Berkeley accelerator (Heilbron & Seidel, 1989). Our discussion of this instrumentation revolves around a series of giant particle-detection devices. Such apparatus combine the trajectory of massification, the instrument function of detection and, finally, the instrument environment expressed as bureaucracy. The underlying logic of extensionist instrumentation can be best understood as a process of ongoing increase in scale – bigger devices, more sensitive detectors and growing bureaucratization. We will document the structure and dynamics of this type of instrument growth with reference to apparatus that detect a family of objects – cosmic and gamma rays, muons and nuclear particles … We will discuss the operation and logic of instrument massification through the chain of devices that began with the Wilson cloud chamber (Nobel Prize 1927), continued with the Glaser bubble chamber (Nobel Prize 1960) and culminated in the Big European Bubble Chamber (BEBC). The reader may reasonably ask what is the difference between the 'branching' instrumentation trajectory of linked instrumentation (see below) and the massification trajectory of the instruments we are about to describe. The logic of instrument branching is transformation and evolution through genealogy and combinatorials/weaving. The logic of instrument massification is dramatic growth in size with the retention of the instrument's underlying principles and logic. As will be seen below ('The linked configuration (circumscribed, branching, control)'), branching is a matter of change, as opposed to the massification story so essential to instrument continuity.
The story begins with the invention of the cloud chamber by Charles Thomson Rees Wilson (1869–1959), for which he was awarded the Nobel Prize in 1927 'for his method of making the paths of electrically charged particles visible by condensation of vapour'.8 Wilson was first trained at Manchester University in biology (intending to become a physician), and then went on to Cambridge, where he worked with Joseph J. Thomson and Ernest Rutherford in atomic physics. He designed and constructed an apparatus related to optical and atmospheric phenomena capable of detecting particle tracks (Clinton, 1997). During this process, he observed that moist air freed from dust particles could nevertheless generate a cloud if the expansion and consequent supersaturation exceeded a certain limit. Wilson observed tracks, which he understood to be generated in the supersaturated cloud. He then harnessed his new device to an X-ray tube, which offered the necessary excitation to produce ions. He now understood that X-rays could produce condensation nuclei of the same sort as those occurring in the air of the cloud chamber. This device quickly allowed Wilson to study features of positive and negative ions. From the very outset, he made his cloud chamber available to colleagues working in neighboring fields of nuclear and ion research. Such availability is an important characteristic of the category of extensionist instrumentation. We emphasize the shared, communal profile of the use of such instruments (Wilson, 1927). The Wilson cloud chamber was a desktop device (see Figure 2), whose operation could be mastered by a single person. In that sense, it is autonomous. By 1923, Wilson had perfected his chamber and introduced stereoscopic photography. Photography proved important because he could study at will the tracks of particles detected. Among other results, the photographs he obtained documented the reality of the Arthur H. Compton effect by showing the existence of Compton recoil electrons.9


Figure 2.  Wilson cloud chamber.

The potentialities that Wilson's apparatus exhibited – the detection of atomic particles and their quantum properties, the capacity to operate the instrument in combination with a variety of radiative sources and the practice of sharing the device with scientists pursuing related objectives – opened the way to the development of ever-bigger detection instruments. A significantly larger intermediary instrument was designed by Patrick M.S. Blackett (1897–1974), who was awarded the Nobel Prize in 1948. Beginning in the 1950s, the instrument designed by Donald A. Glaser (1926–2013) succeeded the smaller devices of Wilson and Blackett; this was the bubble chamber, for which he was awarded the Nobel Prize in 1960 (Poggio, 2013). Glaser was trained in mathematics and physics at Caltech. While there, he began to work with cloud chambers, which he found unsatisfactory for technical reasons. Glaser thus designed and constructed a bigger apparatus. His chamber was based on the same principle as those of Wilson and Blackett, that of producing tracks of particles in a fluid milieu. In this respect, the bubble chamber constitutes an extension of the Wilson cloud chamber – same objective, similar concepts but different approaches (Galison, 1997). The two apparatus offer data in the form of visual information – tracks characterized by length and shape. Analysis and understanding revolved entirely around the profile of tracks. To this extent, they are similar. However, Glaser's device constitutes a massification along six lines. First, the bubble chamber was frequently several orders of magnitude bigger than the Wilson cloud chamber. Recall that the Wilson cloud chamber sits comfortably on top of a desk. The bubble chamber shown in the photograph (Figure 3) measures four meters in diameter. Second, greater detector size privileges observation of track curvature, which corresponds to subatomic particle momentum.
Third, it introduced a huge magnet in order to generate the necessary magnetic field. The scale of this magnet constituted a crucial element of instrument massification. Fourth, the number of detection cameras was tripled. Fifth, unlike the cloud chamber, the operating system of the Glaser chamber did not have to be constantly reset, which allowed continuous monitoring of the unfolding of events. The result was a million-fold massification in the number of photographs, which occasionally yielded 'golden events'. Sixth, and finally, the bubble chamber generally operated in combination


Figure 3.  Bubble chamber.

with particle accelerators. Taken together, by comparison with Wilson's chamber, Glaser's apparatus is a marvel of incremental technical complexity and of massification (Galison, 1997). This massification trajectory powerfully impacts the instrument's organizational environment. The material size of the device, the work associated with operating it and the huge expense of the Glaser bubble chamber and its successors necessitate a bureaucratic research environment. The organization of work related to the photographs illustrates the bureaucratic character of this instrument. The photographs of the particle tracks provided by the cameras had to be individually studied for identification of target events. Armies of people who were not qualified in physics were employed to scrutinize the thousands of images produced during experiments in order to identify relevant information. Work entailed routinized, standardized and administratively controlled practices (Galison, 1997). This work was just one step in a highly coordinated, strongly differentiated and organized research regime. Instrument bureaucratization, like bureaucratization in general, entails an astronomic budget, management, planning, a division of labor (specialists from many different domains), geographic considerations concerning the actors, coordination – in our case national and international research policy – and, of course, politics. Here the scale and the technical as well as social complexity of the instrument generate a kind of tightly meshed and scheduled instrumentation environment. This is nowhere clearer than in the case of the BEBC, which was conceived in 1964, installed at CERN and used to study particle physics until 1984.10 The period between the conception of the BEBC instrument


program and the device’s coming on line with all its complexity ran for over six years, beginning in 1964. Debate was heated and prolonged concerning technology, budget and institutional responsibility. Authority was divided between the French and German research ministries and, to a lesser extent, other European governments (Sweden, Holland, Switzerland …), the Commissariat à l’Energie Atomique (CEA), CERN … Industrial actors, such as Creusot Loire Metallurgie, were charged with component construction. Planning the instrument was not simply a matter of design and contested control, but also a matter of money. At one point, the budget for the BEBC stood at 84 million 1966 francs. How much more bureaucratic can an instrument’s organizational environment be? The gigantic proportions of the device suggest the extent to which planning and management were required (Krige, 1996). The diameter of the BEBC vessel was 3.7 meters, and it stood 4 meters in height. It incorporated some 350 tons of stainless steel and was filled with 35 cubic meters of liquid, whose pressure was regulated by means of a huge piston weighing 2 tons. The vessel was surrounded by a giant superconducting magnet.11 The differentiated organization of work associated with the BEBC was acute (researchers, engineers and workers). During its active life, the BEBC generated about 6.3 million photographs in the course of 22 experiments devoted to neutrino or hadron physics. The BEBC constituted an important node of convergence for the research of scientists in a large number of laboratories. Over 600 researchers from more than 50 laboratories throughout the world visited CERN and used the BEBC during its lifetime. The circulation of these scientists coming to the instrument and returning to their home base had to be planned, since each experiment required appropriate organization.

The linked configuration (circumscribed, branching, control): The case of Serge Haroche and photon trapping (Nobel Prize 2012)

In the following discussion of scientific instrumentation, we point to a crucial but infrequently noted and little-understood aspect of the relationship between instrumentation and knowledge. It is commonly argued that instrumentation is a source of understanding of the physical world. In this account, instrumentation is an enabling device. In our interpretation, this is also true. However, our view of instrumentation additionally reveals that, in some instances, knowledge itself cannot be dissociated from a research instrument. In this case, instrument and learning are one. The Nobel Prize winning research of Serge Haroche (2012) (along with David J. Wineland) exemplifies the linked configuration. Haroche received the award for 'groundbreaking experimental methods that enable measuring and manipulation of individual quantum systems'.12 The French physicist Serge Haroche (born in 1944) was admitted to both the illustrious École Polytechnique and the École Normale Supérieure, and chose to pursue his university-level studies at the latter. He specialized in atomic physics and quantum optics. After his PhD on 'dressed atoms' (1967–1971), supervised by Claude Cohen-Tannoudji (himself a Nobel Prize recipient, along with Steven Chu and William D. Phillips, in 1997), and postdoctoral research on laser spectroscopy with Arthur Leonard Schawlow (Nobel Prize 1981) at Stanford in 1972–73, Haroche went on to develop new methods for laser spectroscopy. He then moved on to the study of Rydberg atoms (giant atomic states


particularly sensitive to microwaves), which are well adapted for exploring the interactions between light and matter. He showed that such atoms, coupled to a superconducting cavity containing a few photons, are well suited to the testing of quantum decoherence and to the realization of the quantum logic operations necessary for the treatment of quantum information. Haroche was the director of the physics department of the École Normale Supérieure from 1994 to 2000, and was appointed to the quantum physics chair at the Collège de France in 2001. The branching type of instrument trajectory of Haroche's device is grounded, among others, in the contribution of Arthur Kastler (Nobel Prize 1966) and the above-mentioned instrument work of Claude Cohen-Tannoudji. Kastler's contribution revolved around the optical pumping method, which uses light beams to orient the magnetic moments of atoms. Cohen-Tannoudji's input consists in methods for manipulating matter with light. Previously, it was only possible to manipulate and observe large ensembles of billions of atoms contained in a resonance cell. A third branching component central to Haroche's new instrumentation knowledge was the cavity quantum electrodynamics (QED) interferometer, building on the resonance method developed in the late 1940s by Norman F. Ramsey (Nobel Prize 1989). This may be seen as a heroic branching, where giants stand on the shoulders of giants. The instrument knowledge characteristic of Serge Haroche's research also contains a large number of more subtle branches. The substance of branching involves the collection and combination of interwoven pieces of information and technology in order to address and resolve a research problem. In the case of Haroche, combinatorials and their weaves intermesh with the Kastler–Cohen-Tannoudji–Ramsey instruments just described. Combinatorials are components that come from one or many horizons; they work together and form an association addressing a specific problem.
Combinatorials and their association are particularly important for instrumenting science (Marcovich & Shinn, 2014: particularly Chs 1 and 4). The combinatorials in the instrument trajectory of Haroche are connected to his instrument function type, which is control. Haroche's contribution to instrumentation knowledge consists in the capacity to isolate and control individual atoms and/or photons. Their behavior becomes deterministic, as opposed to the probabilistic behavior ordinarily described by quantum electrodynamics. The research field at stake, which deals with atoms and photons interacting in a space confined by mirrors, is known as cavity quantum electrodynamics. The objective of control was expressed everywhere in the instrument. 'Our ambition back at the end of the 1980s became to generate photons in a high Q cavity and to observe and manipulate these photons without destroying them' (Haroche, 2012). In order to increase the reflectivity of the mirrors, the copper was replaced with superconducting niobium, cooled to a few kelvin. Acquisition of a tunable dye laser was one of the first steps in obtaining a single, controllable atom. The goal of obtaining a single atom bouncing back and forth inside the cavity and emitting one photon also entailed specific cavity dimensions. The maintenance of bouncing involved constant observation and measurement and, when necessary, the injection of additional controlled energy. Haroche's above-described work is an exquisite example of instrument knowledge. In the case at hand, the switch from indeterminacy toward determinacy and the issue of control converge powerfully. One may ask whether these two elements constitute limiting conditions of instrument knowledge. And beyond this, what alternative scenario can accord with this form of knowing?


Figure 4.  Haroche group.

The material, technical and intellectual characteristics of this device correspond to a specific type of instrument work organization environment, the circumscribed type. Haroche's instrument fits into the moderate dimensions of the laboratory space of a circumscribed environment: it is located in a mere two rooms. This contrasts with the desktop dimension of an autonomous device and with the vast construction complexes of bureaucratic environments. The size of the group of scientists involved in the photon-trapping project corresponds to the second marker of the circumscribed environment. The Haroche group functions on the basis of a mere one or two dozen researchers, engineers and advanced students. Their highly specialized, complementary knowledge and skills focus on the same target. Another powerful characteristic of the circumscribed form of research environment is teamwork and a spirit of comradeship (see Figure 4). Finally, the budget of this laboratory is just a few million euros, as compared to the hundreds of millions required for the bureaucratic environment.

Where are scientific instruments going?

This section focuses on issues of continuity and transformation in research instrumentation. The history of Nobel physics instrumentation falls into two chronological periods – the dividing line is 1950. The 11 devices of the first half of the 20th century are markedly oriented toward one configuration – the bounded configuration numbers seven devices, the extension configuration includes two apparatus, the linked configuration


none, and two instruments are what we term hybrid. A hybrid device is an instrument that does not obey the alignment of stipulated types constitutive of our three configurations. Here one observes the assembly of types in combinations that do not follow the logic imposed by our categories. The second half of the 20th century (numbering 15 instruments) is for its part dramatically monolithic – ten devices correspond to our category of linked instrumentation, one belongs to the bounded configuration, one to extension and three are hybrid. We will treat the five hybrid cases at the end of this section. The bounded configuration that is predominant during the first period is, as described above, composed of the autonomous work organization, a static trajectory and the function of metrology. Instruments such as the Michelson interferometer (Nobel Prize 1907), the Rabi nuclear magnetic resonance instrument (Nobel Prize 1944) and the Shull neutron diffraction technique (developed in 1946, Nobel Prize in 1994) typify the bounded configuration for the period. Only one bounded configuration instrument appeared in the second period, which in the history of the Nobel Prize spells its closure. The invention and operation of all these devices involved a single person or a very small group of researchers, the instruments were static in the sense that they did not spawn generations of evolving apparatus, and their function was steadfastly measurement. How can one explain this? In the early decades of the 20th century, the science and industry of research instrumentation was not yet a differentiated domain of activity; it was still in its youth (Baird, 2004; Heilbron & Seidel, 1989; Reinhardt, 2006). In many ways, it was artisanal even in its most advanced form (except in Germany, which was the center of advanced research instrumentation), and it was not yet a flourishing and recognized science – a status that only came to pass after the Second World War (Shinn, 2008).
The science of the day frequently focused on the discovery or characterization of things previously unknown and not even dreamed of, such as radioactivity, fission and nuclear magnetic effects. This context shaped the aspirations and design of contemporary devices. Research associated with characterization is frequently satisfied within a framework of measurements that can be performed adequately by solo devices. Here there is little need to assemble or complexify investigative apparatus through the introduction of incremental combinatorials. Simplicity suffices. So why change? Improvements in such devices do not lead to branching but instead to intra-device sophistication. These instruments manifested a high degree of continuity. Tradition is stronger here than transformation, yet this has not impaired their long-term utilization. Indeed, continuity is not necessarily synonymous with obsolescence. The early decades of the 20th century offer two examples of the extension category of instrumentation, along with two examples of hybrid instrumentation that are not covered by our typology. It should be remembered that, during this period, the bounded configuration of instrumentation was assertively dominant. The logic of the extension category of apparatus exhibits a characteristic that is unique in our typology of instrumentation structure. In the case of bounded and linked devices, there exists a balance between the three types composing them – function, trajectory and work organization. By contrast, in the case of extension, the detection function underpins and animates both the trajectory and the work organization of the instruments. In the Nobel examples we encounter, this rule covers the long 20th century.


In effect, as we have seen above, this is evidenced in the Nobel Prizes of Wilson (1927), Blackett (1948) and Glaser (1960), all of which deal with particle and ray detection. However, it must be added that the instrument logic of the extension configuration applies equally well to domains not covered by the Nobel Prize, such as astronomy (Lambright, 2014) and oceanography. What interests us here is the logic of the dynamics led by the detection function, which propels the other parameters of the configuration (the massification trajectory and the bureaucratization form of work organization) toward ever-greater instrument complexity and incremental size. Technical complexification means the constant addition of adjacent apparatus, which in turn leads to what we have termed 'massification'. The desktop Wilson cloud chamber grew into the huge, multi-ton bubble chamber detector developed by Glaser, which eventually evolved into the multi-kilometer instruments located at CERN. On a parallel register, the logic of massification has required the recruitment and collaboration of innumerable technical and administrative as well as scientific specialists, whose numbers and differentiation call for a bureaucratic work organization. Such extension instruments cannot escape the dictatorship of bureaucracy. The period from 1950 to today has been almost entirely dominated by the linked configuration, where the circumscribed organizational form of instrument work is paramount, along with control and branching.
In addition to the case of the Haroche team and photon trapping described above, other instances of Nobel Prize winners grounded in the linked configuration of control function, branching and circumscribed work environment include Arthur Kastler's prize 'for the discovery and development of optical methods for studying Hertzian resonances in atoms' (optical pumping) (1966), Hans Dehmelt and Wolfgang Paul's 'for the development of the ion trap technique' (Nobel Prize 1989), or Steven Chu, Claude Cohen-Tannoudji and William D. Phillips' 'for development of methods to cool and trap atoms with laser light' (Nobel Prize 1997). Note that all these instruments belong to the class of devices that create phenomena. In all, there have been ten linked configuration instruments in the post-1950 period, compared with one bounded, one extension and three hybrid. The devices corresponding to the linked configuration are not only technologically intricate, but, more significantly, they are also highly complex. The requirement of an internally differentiated workforce to design, construct and use them is just one marker of the material and intellectual complexity associated with this form of instrumentation. This complexification is further fueled by technological branching, where previous generations and different species of devices converge and fuse into a single, novel machine. What could be more emblematic of the centrality of transformation? The instrument function of control constitutes the foundation of the power of this instrument configuration. The achievement of control-related instrumentation is the capacity to isolate, purify and transform matter – including creating new matter and phenomena (Bachelard, 1938; Hacking, 1983; Marcovich & Shinn, 2014). Taken together, control, instrument branching and circumscribed work organization – the linked configuration – form a virtuous circle that perpetuates transformation.
Indeed, since the late 1950s and the intellectual and institutional rise of materials engineering, a proclivity has developed in important segments of the global scientific community to reorient research toward the exploration of artificial materials. It is now


possible to fabricate materials and phenomena that become objects of exploration. This is part of the essence of nanoscale research. This turn toward control has been richly rewarded by the Nobel Prize committee, as seen above. One can observe that the instrument function, control, invites the two other parameters of this configuration – the branching trajectory and the circumscribed work environment. The technical complexities inherent in control-based instruments necessarily call for and mobilize interfacing with a variety of tightly combined annex devices. This was explicit in our above description of Haroche's apparatus, which enables the trapping and examination of single photons. Before this period, the technical capacity and possibilities of instruments to interface were not such that ultimate control could be realized. It is this ultimate control that is constitutive of instrument knowledge. The breadth and combination of apparatus characteristic of control mobilize specialists from numerous domains. Of utmost importance, the logic of activity must revolve around the integration of specialties and even complicity. This goes far beyond the association characteristic of the convergent divisions of labor proper to bureaucratic organization. The Nobel Prize for instrumentation includes five devices that do not square with our three configurations.
These went to: Ernest Orlando Lawrence, 'for the invention and development of the cyclotron and for results obtained with it, especially with regard to artificial radioactive elements' (1939)13; Percy Williams Bridgman, 'for the invention of an apparatus to produce extremely high pressures, and for the discoveries he made therewith in the field of high pressure physics' (1946) (Walter, 1990)14; Dennis Gabor, 'for his invention and development of the holographic method' (1971)15; Gerd Binnig and Heinrich Rohrer, 'for their design of the scanning tunneling microscope' (1986)16; and Georges Charpak, 'for his invention and development of particle detectors, in particular the multiwire proportional chamber' (1992).17 However, all of these instruments exhibit at least one of the types of our configurations and sometimes even two. Binnig and Rohrer's scanning tunneling microscope possesses all three instrument functions (metrology, detection, control) and is both autonomous and branching (see Appendix 2). We introduce the birth and evolution of one hybrid instrument here in order to illustrate how such renegade devices violate our instrument typology. The case of Lawrence's cyclotron illuminates this point: the design and construction of the device was such that it required more than the capacity of an individual, yet it was not so complex that it needed an army of scientists, engineers and technicians. The famous accelerator was conceived and constructed by a small cadre of researchers. In a word, it was a circumscribed work organization. Its function was clearly control. The manufacture of high energies and the generation of new chemical elements depended on the capacity to accelerate and select atomic particles. In a word, Lawrence's cyclotron was a control apparatus. As the demand for ever-higher energy particles emerged, the cyclotron grew ever greater in size, ultimately spawning a huge accelerator at CERN.
The tiny team working at the Berkeley Laboratory in the 1930s has since become a huge bureaucratic army. This transformation is a model hybrid that escapes our instrumentation typology. Hybrid cases may be viewed as ventures associated with unforeseen forms of instrument elasticity in the relationships between function, trajectory and social organization.


Conclusion

Our century-long historical overview reveals a radical shift in the development of research instrumentation in physics, as evidenced in the Nobel Prizes. The first half of the 20th century was dominated by the 'bounded' configuration of devices (seven prizes out of 11, and none for 'linked'), where the characteristics of metrology, a static trajectory and autonomy were paramount. Science seen through the lens of instrumentation here reinforces the historical category of 'little' science for the first half of the century. Our typology would suggest that metrology lies at the very heart of 'little' science. The primacy of autonomous instrument work organization and the fact that there was little trans-instrument intersecting argue in the same direction. Contrary to current belief, if viewed through the lens of instrumentation, the latter half of the 20th century has not been the province of big science. Our instrument typology suggests instead that the linked configuration (ten awards out of 15 versus only one for bounded) has been and remains predominant, as seen through the Nobel Prizes. This constitutes a clear victory for medium-scale research ventures. Note that big science (as illustrated by a bureaucratic work organization and massification) has four victories (Lawrence's instrument and those of Blackett, Glaser and Charpak) – two belong to the extension configuration and two are hybrids. In the case of our typology, the general morphology is best illuminated by instrument function, which tends to structure the other parameters. Finally, many of the Nobel Prize linked instruments of the last half of the 20th century illustrate the new epistemological turn that we have termed 'instrument knowledge'. The relevant instruments are designated by a star in the list of Nobel Prizes featured in Appendix 1. One of the unanticipated consequences of our exploration of Nobel Prize physics instrumentation has been the discovery of instrument knowledge.
Indeed, a large portion (11 out of 19) of the research awarded the Nobel Prize in instrumentation physics after 1950 can be understood as instrument knowledge. Instruments such as Haroche's device illustrate instrument knowledge on two grounds. First of all, they succeed in extinguishing the noise of nature and in isolating the species of phenomena of interest to the researcher – in the case of Haroche's device, individual photons or atoms operating deterministically. Second, they enable control over the phenomenon. Both of these capacities were sometimes present in older generations of apparatus. However, it is the combination of the capacity to extinguish the noise of nature and to impose control at the same time that underpins our instrument knowledge. The instrument achieves both objectives. What one 'knows' about the phenomenon is largely encapsulated in what one knows about, and can do with, the instrument. Stated differently, one can argue that today much of what a scientist has to say about their results is articulated in the language of their instrumentation. The knowledge is embedded in the instrument, and the results obtained from the instrument count as knowledge. The consideration of entities developed within instrumentation becomes a cornerstone of intelligibility in science. In other words, today the instrument inspires the phenomenon, and the phenomenon is rendered intelligible in the language of the instrument. This is instrument knowledge. Let us be clear: the emergence and place of instrument knowledge in no way eclipses the epistemology of experimentation or theory. Instrument knowledge, experimentation and theory can be understood as constituting a complementary epistemological triangle.

Marcovich and Shinn

21

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Notes

1. We are aware of the possibility that the selection of Nobel instruments may importantly affect some of our observations, since the Nobel committee's choices may sometimes reflect decisions to promote a specific category of device or the salience of a mode. Whatever the case, we consider that the instruments presented here are illustrative of 20th-century instrument history.
2. In the discipline of chemistry, a mere five instruments have received the award. The following prizes could well instead have been classified in physics: in 1998, the Density Functional Theory methodology of J. Pople and W. Kohn (see Johannes Lenhard, this volume); in 1991, R. Ernst, 'for his contributions to the development of the methodology of high resolution nuclear magnetic resonance (NMR) spectroscopy'; and, in 2014, E. Betzig, S. Hell and W. Moerner, for the development of super-resolved fluorescence microscopy (Hentschel, 2015). In the discipline of medicine, two Nobel Prizes went to instrumentation: A. MacLeod Cormack and G. Newbold Hounsfield, for 'the development of computer assisted tomography', in 1979 (Blume, 1991); and P.C. Lauterbur and P. Mansfield, 'for their discoveries concerning magnetic resonance imaging', in 2003.
3. Our nine types offer 27 combinations. Of the 27 possibilities, three emerge as historically dominant. Our selection of types is largely rooted in the empirical information associated with the Nobel Prize instrument laureates.
4. Some examples of this include: Charles H. Townes, 'for fundamental work in the field of quantum electronics, which has led to the construction of oscillators and amplifiers based on the maser-laser principle' (Nobel Prize 1964); Alfred Kastler, 'for the discovery and development of optical methods for studying Hertzian resonances in atoms' (Nobel Prize 1966); Nicolaas Bloembergen and Arthur L. Schawlow, 'for their contribution to the development of laser spectroscopy', with the other half of the prize going to Kai M. Siegbahn, 'for his contribution to the development of high-resolution electron spectroscopy' (Nobel Prize 1981).
5. Zacharias Janssen (1585–1632) was first credited with the invention of the microscope. It is only from the 1660s and 1670s that the microscope was used extensively for research in Italy, the Netherlands and England. The greatest contribution came from Antonie van Leeuwenhoek (1632–1723), who achieved up to 300 times magnification.
6. See https://en.wikipedia.org/wiki/Frits_Zernike; Van Berkel et al., 1999.
7. The scientist's career-long involvement in metrology is visible in his contributions to general optics. Zernike's commitment to microscope metrology clearly emerges in his publications from 1933 to 1950. This is evident from the following journal publication titles: 'Inflection theory of the cutting method and its improved form, the phase contrast method' (Zernike, 1934); 'The phase contrast process in microscopic examinations' (Zernike, 1935); 'Phase contrast, a new method for the microscopic observation of transparent objects' (Zernike, 1942); and 'Color phase-contrast microscopy – requirements and applications' (Zernike, 1950).
8. See http://www.nobelprize.org/nobel_prizes/physics/laureates/1927/.
9. Arthur H. Compton was awarded the Nobel Prize the same year as Wilson (1927).
10. This mammoth device never received a Nobel Prize.
11. See https://en.wikipedia.org/wiki/Big_European_Bubble_Chamber.
12. See https://www.nobelprize.org/nobel_prizes/physics/laureates/2012/.
13. This instrument is characterized by the instrument function of control, the massification trajectory and a circumscribed work organization becoming bureaucratic.


Social Science Information 00(0)

14. This instrument is characterized by the instrument function of control, the static trajectory and the circumscribed work organization.
15. This instrument is characterized by the instrument function of metrology/control, the static trajectory and the autonomous work organization.
16. This instrument is characterized by the instrument functions of metrology, detection and control; its trajectory is branching and its work organization is autonomous.
17. This instrument is characterized by the detection function, its branching and massification trajectories, and its circumscribed and bureaucratic work organizations.

References

Bachelard G (1938) La formation de l'esprit scientifique. Paris: J. Vrin.
Baird D (2004) Thing Knowledge: A philosophy of scientific instruments. Berkeley, CA: University of California Press.
Barkan D (1999) Walther Nernst and the Transition to Modern Physical Science. Cambridge: Cambridge University Press.
Blume S (1991) Insight and Industry. Cambridge, MA: MIT Press.
Bromberg J (1991) The Laser in America, 1950–1970. Cambridge, MA: MIT Press.
Butrica A (1996) To See the Unseen: A history of planetary radar astronomy. NASA History Office.
Chadarevian S de, Hopwood N (eds) (2004) Models: The third dimension of science. Palo Alto, CA: Stanford University Press.
Chaloner C (1997) The most wonderful experiment in the world: A history of the cloud chamber. British Journal for the History of Science 30(3): 357–374.
Choi H, Mody C (2009) The long history of molecular electronics: Microelectronics origins of nanotechnology. Social Studies of Science 39(1): 11–50.
Crawford E (1987) The Beginnings of the Nobel Institution: The science prizes, 1901–1915. Cambridge: Cambridge University Press.
Crawford E (2002) The Nobel Population 1901–1950: A census of the nominators and nominees for the prizes in physics and chemistry. Tokyo: Universal Academy Press.
Elzen B (1986) Two ultra-centrifuges: A comparative study of the social construction of artifacts. Social Studies of Science 16(4): 621–662.
Farge Y (2012) L'élaboration du projet ESRF, la coopération européenne dans le domaine du rayonnement synchrotron. Histoire de la recherche contemporaine 1(1): 16–25.
Florence R (1994) The Perfect Machine: Building the Palomar telescope. New York, NY: Harper Collins.
Galison P (1997) Image and Logic: A material culture of microphysics. Chicago, IL: University of Chicago Press.
Hacking I (1983) Representing and Intervening: Introductory topics in the philosophy of natural science. Cambridge: Cambridge University Press.
Haroche S (2012) Controlling photons in a box and exploring the quantum to classical boundary. 2012 Nobel Prize in Physics lecture. Available at: https://www.nobelprize.org/nobel_prizes/physics/laureates/2012/haroche-lecture.pdf (accessed 28 April 2017).
Heilbron J, Seidel R (1989) Lawrence and His Laboratory: A history of the Lawrence Berkeley Laboratory. Berkeley, CA: University of California Press.
Hentschel K (2015) Periodization of research technologies and of the emergence of genericity. Studies in the History and Philosophy of Modern Physics 52: 223–233.
Hesse M (1966) Models and Analogies in Science. Notre Dame, IN: University of Notre Dame Press.
Hoddeson L, Braun E, Teichmann J, Weart S (eds) (1992) Out of the Crystal Maze: Chapters from the history of solid state physics. Oxford: Oxford University Press.


Humphreys P (2004) Extending Ourselves: Computational science, empiricism, and scientific method. New York, NY: Oxford University Press.
Joerges B, Shinn T (2001) Instrumentation between Science, State and Industry. Dordrecht: Kluwer Academic Publishers.
Jungnickel C, McCormmach R (1986) Intellectual Mastery of Nature. Chicago, IL: University of Chicago Press.
Katzir S, Lehner C, Renn J (eds) (2010) Traditions and Transformations in the History of Quantum Physics. HQ-3: Third International Conference on the History of Quantum Physics, Berlin, 28 June–2 July. Berlin: Edition Open Access.
Kragh H (2002) Quantum Generations: A history of physics in the twentieth century. Princeton, NJ: Princeton University Press.
Krige J (1996) History of CERN, Vol. III. Amsterdam: Elsevier.
Kuhn TS (1978) Black-Body Theory and the Quantum Discontinuity, 1894–1912. Chicago, IL: University of Chicago Press.
Lambright H (2014) Why Mars: NASA and the politics of space exploration. New Series in NASA History. Baltimore, MD: Johns Hopkins University Press.
Lenhard J, Küppers G, Shinn T (eds) (2007) Simulation: Pragmatic constructions of reality. Dordrecht: Springer Science & Business Media.
Marcovich A, Shinn T (2013) Respiration and cognitive synergy: Circulation in and between scientific research spheres. Minerva 51(1): 1–23.
Marcovich A, Shinn T (2014) Toward a New Dimension: Exploring the nanoscale. Oxford: Oxford University Press.
Mitchell D (2012) Measurement in French experimental physics from Regnault to Lippmann: Rhetoric and theoretical practice. Annals of Science 69(4): 453–482.
Mody C (2011) Instrumental Community: Probe microscopy and the path to nanotechnology. Cambridge, MA: MIT Press.
Morgan M, Morrison M (eds) (2008) Models as Mediators: Perspectives on natural and social sciences. Cambridge: Cambridge University Press.
Nye M-J (1976) The nineteenth-century atomic debates and the dilemma of an 'indifferent hypothesis'. Studies in History and Philosophy of Science 7(3): 245–268.
Olesko K (1991) Physics as a Calling: Discipline and practice in the Königsberg Seminar for Physics. Ithaca, NY: Cornell University Press.
Pestre D (1992) The decision-making process for the main particle accelerators built throughout the world from the 1930s to the 1970s. History and Technology 9: 63–174.
Poggio T (2013) Donald Arthur Glaser (1926–2013): Physicist and biotechnologist who invented the bubble chamber. Nature 496(7443): 32.
Reinhardt C (2006) Shifting and Rearranging: Physical methods and the transformation of modern chemistry. Sagamore Beach, MA: Science History Publications.
Shinn T (1987) Géométrie et langage: La structure des modèles en sciences sociales et en sciences physiques. Bulletin de méthodologie sociologique 16: 5–38.
Shinn T (1993) The Bellevue Grand Electroaimant, 1900–1940: Birth of a research-technology community. Historical Studies in the Physical Sciences 24: 157–187.
Shinn T (1997) Crossing boundaries: The emergence of research-technology communities. In: Etzkowitz H, Leydesdorff L (eds) Universities and the Global Knowledge Economy: A triple helix of university–industry–government relations. London: Pinter, 85–96.
Shinn T, Joerges B (2002) The transverse science and technology culture: Dynamics and roles of research-technology. Social Science Information 41(2): 207–251.
Shinn T (2007) When is simulation a research technology? Practices, markets, and lingua franca. In: Lenhard J, Küppers G, Shinn T (eds) Simulation: Pragmatic constructions of reality. Dordrecht: Springer Science & Business Media, 187–203.


Shinn T (2008) Research-Technology and Cultural Change: Instrumentation, genericity, transversality. Oxford: The Bardwell Press.
Simoulin V (2012) Sociologie d'un grand équipement scientifique: Le premier synchrotron de troisième génération. Lyon: ENS Editions.
Simoulin V (2016) Les générations de synchrotrons: Des communautés et des équipements au croisement du national et de l'international. Revue française de sociologie 57(3): 495–519.
Van Berkel K, Van Helden A, Palm L (1999) Frits Zernike 1888–1966. In: A History of Science in The Netherlands: Survey, themes and reference. Leiden: Brill, 609–611.
von Hippel E (2005) Democratizing Innovation. Cambridge, MA: MIT Press.
Walter M (1990) Science and Cultural Crisis: An intellectual biography of Percy Williams Bridgman (1882–1961). Palo Alto, CA: Stanford University Press.
Wheaton B (1983) The Tiger and the Shark: Empirical roots of wave-particle dualism. Cambridge: Cambridge University Press.
Whitley R (1980) The context of scientific investigation. In: Knorr K, Krohn R, Whitley R (eds) The Social Process of Scientific Investigation. Dordrecht: Springer, 297–321.
Wilson C (1927) On the cloud method of making visible ions and the tracks of ionizing particles. Nobel Prize lecture, 12 December. Available at: http://www.nobelprize.org/nobel_prizes/physics/laureates/1927/wilson-lecture.pdf (accessed 28 April 2017).
Winsberg E (2010) Science in the Age of Computer Simulation. Chicago, IL: University of Chicago Press.
Zernike F (1934) Inflection theory of the cutting method and its improved form, the phase contrast method. Physica 1: 689.
Zernike F (1935) The phase contrast process in microscopic examinations. Physikalische Zeitschrift 36: 848–851.
Zernike F (1942) Phase contrast, a new method for the microscopic observation of transparent objects. Physica 9(10): 974–986.
Zernike F (1950) Color phase-contrast microscopy – requirements and applications. Journal of the Optical Society of America 40(5): 329–334.
Zernike F (1953) How I discovered phase contrast. Nobel Prize lecture, 11 December. Available at: http://www.nobelprize.org/nobel_prizes/physics/laureates/1953/zernike-lecture.html (accessed 28 April 2017).

Appendix 1

List of the 26 Nobel Prizes in instrumentation physics (the instruments associated with instrument knowledge are indicated by an asterisk).

1907 Albert Abraham Michelson, 'for his optical precision instruments and the spectroscopic and metrological investigations carried out with their aid'.
1927 Charles Thomson Rees Wilson, 'for his method of making the paths of electrically charged particles visible by condensation of vapour'.
1939 Ernest Orlando Lawrence, 'for the invention and development of the cyclotron and for results obtained with it, especially with regard to artificial radioactive elements'.
1943 Otto Stern, 'for his contribution to the development of the molecular ray method and his discovery of the magnetic moment of the proton'.
1944 Isidor Isaac Rabi, 'for his resonance method for recording the magnetic properties of atomic nuclei'.


1946 Percy Williams Bridgman, 'for the invention of an apparatus to produce extremely high pressures, and for the discoveries he made therewith in the field of high pressure physics'.
1948 Patrick Maynard Stuart Blackett, 'for his development of the Wilson cloud chamber method, and his discoveries therewith in the fields of nuclear physics and cosmic radiation'.
1950 Cecil Frank Powell, 'for his development of the photographic method of studying nuclear processes and his discoveries regarding mesons made with this method'.
* 1952 Felix Bloch and Edward Mills Purcell, 'for their development of new methods for nuclear magnetic precision measurements and discoveries in connection therewith' (work in 1946).
1953 Frits Zernike, 'for his demonstration of the phase contrast method, especially for his invention of the phase contrast microscope'.
1954 Walther Bothe, 'for the coincidence method and his discoveries made therewith'.
1960 Donald Arthur Glaser, 'for the invention of the bubble chamber'.
* 1964 Charles Hard Townes, Nicolay Gennadiyevich Basov and Aleksandr Mikhailovich Prokhorov, 'for fundamental work in the field of quantum electronics, which has led to the construction of oscillators and amplifiers based on the maser-laser principle'.
* 1966 Alfred Kastler, 'for the discovery and development of optical methods for studying Hertzian resonances in atoms'.
1971 Dennis Gabor, 'for his invention and development of the holographic method'.
* 1981 Nicolaas Bloembergen and Arthur Leonard Schawlow, 'for their contribution to the development of laser spectroscopy'.
* 1981 Kai M. Siegbahn, 'for his contribution to the development of high-resolution electron spectroscopy'.
1986 Ernst Ruska, 'for his fundamental work in electron optics, and for the design of the first electron microscope'.
1986 Gerd Binnig and Heinrich Rohrer, 'for their design of the scanning tunneling microscope'.
* 1989 Norman F. Ramsey, 'for the invention of the separated oscillatory fields method and its use in the hydrogen maser and other atomic clocks'.
* 1989 Hans G. Dehmelt and Wolfgang Paul, 'for the development of the ion trap technique'.
1992 Georges Charpak, 'for his invention and development of particle detectors, in particular the multiwire proportional chamber'.
* 1994 Bertram N. Brockhouse, 'for pioneering contributions to the development of neutron scattering techniques for studies of condensed matter'.
* 1994 Clifford G. Shull, 'for the development of the neutron diffraction technique' (research 1946).


* 1997 Steven Chu, Claude Cohen-Tannoudji and William D. Phillips, 'for development of methods to cool and trap atoms with laser light'.
* 2005 John L. Hall and Theodor W. Hänsch, 'for their contributions to the development of laser-based precision spectroscopy, including the optical frequency comb technique'.
* 2012 Serge Haroche and David J. Wineland, 'for ground-breaking experimental methods that enable measuring and manipulation of individual quantum systems'.

Appendix 2

A hybrid instrument: The Scanning Tunneling Microscope (Nobel Prize 1986)

In 1981 a new species of research instrument was developed, one that largely defies the predictions of the typology presented above and that proved capable of investigations lying far beyond what could reasonably have been expected of a research device. This was the Scanning Tunneling Microscope (STM), whose remarkable characteristics and performance earned it the Nobel Prize in a record time of only five years: it received the prize in 1986 (Marcovich & Shinn, 2014; Mody, 2011). What are the capabilities of this device and what are its characteristics? Just how does it defy the predictions of our research instrumentation typology? The instrument was developed at the IBM Zurich laboratory as part of a big research project in the area of superconducting-based informatics. The STM was first intended to resolve technical problems. While the firm soon all but abandoned the project, it was nevertheless unofficially and semi-secretly continued by its two principal advocates, Gerd Binnig and Heinrich Rohrer. They developed a proto-apparatus capable of detecting and manipulating single atoms and even creating molecular forms. The inventors at first failed to publish their findings and enjoyed little support from the company. Some junior scientists rallied to their device, however; the first article appeared in 1983, and the epitome of prizes was awarded in 1986. To refer to our instrument typology, the STM is autonomous, branching and, amazingly, corresponds to all the types in the function parameter – metrology, detection, control. It is autonomous in the sense that it sits comfortably on a table and can be managed by a single operator. STM branching finds substance in a range of derivative apparatus, among which is the Atomic Force Microscope (AFM), which was developed only shortly after the STM's birth.
The AFM was invented in 1987 and, having even more functions (magnetic and voltaic detection, and a tapping function), it encountered immediate success. Note that the connection between autonomy and branching belongs almost exclusively to devices of the first half of the 20th century, which means that, in one sense, the STM can be seen as archaic, a conservative device. Similarly, note that at no time in the history of Nobel instrumentation, and perhaps at no time at all in instrumentation history, can one identify an apparatus that effectively performs all three instrument functions (metrology, detection, control). Indeed, along with simulation, the STM is one of the two most-cited research instruments listed in the Science Citation Index.


The reader should take note of the fact that the STM is the only a-historic/trans-historic instrument to have been awarded the Nobel Prize, and perhaps the only one of its kind in the entire history of instruments. Most importantly, this device is generic (Shinn, 1997, 2002, 2008), transdisciplinary (it works in physics, chemistry and biology, and in multiple physical environments – solid, liquid and gas) and it permits a deterministic approach to atoms and molecules. It deviates in almost every respect from our instrumentation typology and from the instruments on record in the Nobel Prize system.

Author biographies

Anne Marcovich is a historian and sociologist of medicine and science based at GEMASS – University Paris IV. Her scholarly work focuses on the history of 18th- and 19th-century European medicine, and on the changing structure of authority in history. More recently, Marcovich's research deals with the intellectual and organizational dynamics of scientific research instrumentation.

Terry Shinn is a historian and sociologist of science based at the CNRS in Paris. He has conducted research on the history of French scientific education, the relationships between intellectual structure and social organization, and on the transformation of research instrumentation.