Science Migrations: Mesoscale Weather Prediction

ABSTRACT This paper presents the history of the Belgrade numerical weather prediction models as ‘assemblages’ shaped by the cognitive, social and material circumstances of Tito’s Yugoslavia. It looks at how local researchers customized their model to suit the Balkan weather, IBM computers and socialist science policy, and how they managed to ‘export’ the product internationally. Accommodating the lack of computing power and a mountainous topography, the model at first attracted users with similar concerns. When it was later tested at the US National Meteorological Center, it showed a skill and potential for rapid improvement that led to its implementation in the USA and other national weather services. The success of the Eta model illustrates the ways in which regional research travels into an ‘alien’ institutional and cognitive territory, and how it bears upon the issues of the production and circulation of technoscientific knowledge. It is argued that the difference between the perceptions of international ‘frontier’ technoscience and the local ‘backwater’ adaptation is that the former implies a representational notion in which what matters is a black-boxed ‘fund’ of knowledge, while the latter implies an agenda in which what matters is exactly that – what matters locally.

Keywords: geography of science, modelling, transfer of knowledge, weather prediction, Yugoslavia

Science Migrations: Mesoscale Weather Prediction from Belgrade to Washington, 1970–2000

Vladimir Jankovic

Social Studies of Science 34/1 (February 2004) 45–75 © SSS and SAGE Publications (London, Thousand Oaks CA, New Delhi) ISSN 0306-3127 DOI: 10.1177/0306312704040490 www.sagepublications.com

Since the 1950s, it has become conventional to explain the rise of numerical weather prediction (NWP) as a result of the numerical analysis of atmospheric phenomena and the rapid development of computing technologies. It is argued that the super-computer shortened the time of forecast preparation and made an impact on theory by placing numerical simulation centre-stage. Jule Charney, a post-war pioneer in NWP, described this as the ‘great psychological stimulus that the very possibility of high-speed computation brought to meteorology’ (Nebeker, 1995: 152), while Philip Thompson noted the rapid change in research that was brought about by the high-power computer that ‘has drastically altered the way in which [we, meteorologists] view our problems’ and which has ‘placed [meteorology] on the same footing as other mature physical science’ (Thompson, 1987: 633). In 1987, the meteorologist Akira Kasahara famously claimed that
‘when it comes to the bottom line, whoever has the faster computer will win’ (Nebeker, 1995: 164).1 Other factors, of course, have also been recognized as having played a role in the development of NWP: observation networks, theoretical research, institutional support, international collaboration and exchange of information (Miller, 2001). But as Frederick Shuman wrote: ‘modern NWP began with the invention of the modern computer, and subsequent improvements of NWP have been paced primarily by the advances in computer technology’ (Shuman, 1989: 293). This view informed the decision of the US National Weather Service to reduce its budget from $92 million in 1967 to $85 million in 1979 due to ‘a strong effort to automate, to reduce the need for increasing staff wherever possible by using computers and computer-driven communications and graphics’ (National Advisory Committee on Oceans and Atmosphere, 1982: 23).

While the importance of computer systems in the history of NWP cannot be questioned, and while no contemporary model is designed to produce a forecast without the use of the computer, the role of the latter has generally been overstated. That the power of the microchip did not necessarily and always lead to a more accurate simulation of atmospheric processes can be illustrated with the recent Yugoslavian NWP, in which a lack – rather than an abundance – of computing infrastructure helped researchers develop a more ‘skilful’ model.

Using this fact as a point of departure, my aim is to show how Yugoslavian weather modelling – and other scientific knowledge created in hitherto unexplored locations and institutions – can open up issues of far broader importance for science and technology studies. One of these is the much-discussed notion of the locality of science. The other is the problem of knowledge transfer and the travel of scientific theories.
A more specific issue deals with the extent to which scientific research – done as it mostly was in the developed countries during the second half of the 20th century – has excluded (or included) work of scientists who might have suffered from inadequate computing technologies, sporadic international presence and meagre funding. If Kasahara was right, such scientists would have been defeated in a modelling ‘race’. Indeed, the early development of weather prediction supports this conviction, insofar as it shows that the seminal post-World War II research benefited from a generous sponsorship from the highest ranking institutions and federal governments of the developed countries. For example, Jule Charney’s group worked on von Neumann’s Meteorology Project by using ENIAC and the IBM machines at Princeton and Maryland in the late 1940s. Thompson and Platzman gave landmark courses in numerical prediction at Massachusetts Institute of Technology and the University of Chicago in 1953, while the main centres included the US Air Force Laboratory, the British Meteorological Office and the International Meteorological Institute of the University of Stockholm.2 The current state-of-the-art developments in the field, again, take place in academic and other institutions in Western Europe and North America.

Given the immense technological, financial and institutional support needed for modern NWP, this situation is perhaps inevitable. But is this the whole story? Could NWP be advanced on a scientific ‘periphery’, in countries suffering from a lack of resources, organization and manpower? Can significant (and original) contributions stem from the ‘invisible college’ of scientists working in the less developed countries? And if they can, would such knowledge address the problems defined by the ‘centre’ or by the ‘periphery’?

My answer to these questions will be in the affirmative, even if it derives only from a case study exploring regional NWP modelling in the Socialist Federative Republic of Yugoslavia and the impact of this research on NWP in the USA during the 1970s and 1990s. The Yugoslavian series of models, I argue, resulted from a confluence of that nation’s scientific tradition, material infrastructure, local economy and politics of knowledge. How did it happen that such a confluence resulted in the creation of one of the most advanced NWP products used for operational purposes around the world? If Belgrade’s models embodied the circumstances found in a specific geographic and geopolitical domain, does a translation of these circumstances into the models’ design provide us with a sufficient explanation for their international success? In short, my intention is to show that and how, in the sphere of technoscientific knowledge, material contingencies translate into cognitive robustness, and idiosyncrasies become rules.

This paper will thus raise the problem of the relationship between science and regionality/locality.3 The issue has been pertinent both to scholars interested in the engagements between varieties of cognitive cultures (e.g. Southern vs Northern, indigenous vs global, traditional vs modern), and to those aiming to provide a conceptual framework for analysing the contingencies responsible for shaping technoscientific practices.
Some scholars have focused on regional scientific ‘cultures’, others on ‘national styles’, and some on the interchanges between intellectual centres and peripheries or between ‘native’ and ‘émigré’ approaches. Others have produced ‘reception studies’ or have worked to chart cognitive varieties due to the institutional peculiarities or differences in cultural hierarchization of scientific practice.4 Jacqueline Cramer and Rob Hagendijk, for example, showed the ways in which some Dutch universities during the 1960s created their own approaches in adapting the international trends in systems ecology and even used an entirely independent approach based on local data (Cramer & Hagendijk, 1985).5 Malcolm Nicholson argued that the less taxonomic system of plant classification in the USA reflected the practical priorities of a ‘Land-Grant-college science’, whereas the more taxonomic French method developed in relative isolation from any lay and practical concerns (Nicholson, 1989). Marcos Cueto suggests that the atavistic but highly regarded laboratory practices developed by Argentinean physiologists were a direct result of national economic limitations (Cueto, 1994). In the context of 20th-century meteorology, John Lewis’s research explores the careers of Japanese post-war meteorological emigrants (Lewis, 1993),6 while that of Kristine Harper
discusses a ‘cross-pollination’ of the Scandinavian and US weather traditions that shaped the 1950s ‘Meteorology Project’ at the Institute for Advanced Study at Princeton (Harper, 2001).7 Studies like these have brought to light how local professional motivations, research traditions, infrastructural stipulations and mechanisms of communication inform priorities in local scientific work or generate different cognitive responses to a globally recognized set of problems.

On one level, such responses are a matter of necessity: ‘nations with volcanoes or tsetse flies’, say Paul Hoch and Jennifer Platt, ‘are more concerned with the problems they cause’ (Hoch & Platt, 1993: 133). For Svante Lindqvist, the Swedish strengths in electronics, solid-state physics and electromagnetic theory can be accounted for by the ‘long and cold’ geography of the country (Lindqvist, 1993). On another level, such responses reflect the operation of a common pool of values, actions and goals that structure local technoscience in the manner of ‘assemblages’. When Watson-Verran & Turnbull (1995: 117) use this term of Deleuze and Guattari, they understand its meaning as ‘an episteme with technologies added but that connotes the ad hoc contingency of a collage in its capacity to embrace a wide variety of incompatible components’. The concept refers to an integration of heterogeneous work into cognitive ‘localities’ engineered ‘from the “motley” collection of practices, instrumentation, theories and people’. In modern technosciences, for example, ‘assemblages’ occur through disciplinary societies, instruments, standardization, rhetoric and publication (Turnbull, 1997: 553).8 The resulting entities, such as books, maps, punch-cards and models, enable isolated systems of knowledge to communicate and move in space and time.
The history of Yugoslav–US mesoscale weather prediction is an instructive case of an assemblage made up of scientific knowledge, socialist politics, Mediterranean cyclones and IBM computers. Each component played an indispensable role in shaping the physiognomy of the meteorological products. Thus, the theoretical work involved in the production of the Belgrade models should be regarded as only one of the ‘assembling’ elements. This element, however, provided a foundation, a meeting point, and a receptacle for the other exigencies without which the results would not have proceeded in the direction and at the pace in which they did.

Going beyond the ‘assemblage’ issue, I also look into whether the story of the Belgrade models can tell us more about those mechanisms of knowledge transport/travel whereby what is ‘local’ in one place becomes ‘local’ in another. Does such proliferation of ‘local’ uses result in a global knowledge? Does a multiplication of local truths yield one global truth? Clearly, the question is about the relationship between a socio-spatial ‘individuality’ of scientific claims that make sense in one context, and a cognitive and otherwise ‘collectivity’ that makes sense everywhere. One of the enduring interests in science studies is in showing how individualities grow into collectivities and how collectivities translate into individualities. The Eta case study takes as a premise that the Yugoslavian models represent distinctive assemblages, but it also offers an explanation of how they ‘travelled’ and addressed the
concerns found in alien knowledge spaces. What happens when national/regional research gets uprooted and thrown into foreign environments? How do scientific products fare in new surroundings?

Planning Weather in Socialist Yugoslavia

It is questionable whether the dichotomies between the local and the global, or centre and periphery, can hold in a prima facie ‘international’, ‘collaborative’ and ‘public’ enterprise such as weather prediction. Yet it is widely acknowledged that different regions have unequal capabilities for coping with extreme weather. Such has always been the case in the tropics and other areas outside the mid-latitudes. Owing to differences in atmospheric processes, the models suitable for use in such regions would have to be built independently from those for developed (mid-latitude) countries. This caused a lag in the use of forecasting systems in such regions, not least because of a sheer lack of telecommunications or warning systems. The net result was a growing separation between the weather-sensitive and weather-insensitive nations. It is difficult to imagine any developed (weather-insensitive) country suffering a disaster similar to the 1970 Bangladesh cyclone, which claimed 500,000 victims (Sah, 1979). To respond to hazards of this scale, less-developed nations have usually considered NWP a crucial component of their science and technology policies.

One of the main drawbacks in the growth of local NWP, however, was the discrepancy between the size of the necessary undertaking and the availability of local resources. Should the less-developed nations build their ‘own’ prognostic meteorology, or should they wait until such knowledge became available in the North and the West? Early policies generally followed the latter option, assuming that the internationally available scientific and technological knowledge would help less-developed countries to ‘leap-frog’ certain phases of development and catch up with the ‘early starters’ (Rath, 1990: 1432). This proved to be a naive hope: foreign knowledge turned out to be neither free nor easily applicable to local needs.
In fact, importing a ready-made science tended to separate local economy from local science. This led developing nations into a ‘dependency’ relation with the great powers, because ‘by adopting Westernised science and Western organizational forms, less developed countries help to promote comparability and compatibility but not solutions to local problems’ (Shrum & Shenhav, 1995: 632).9 Following the results of analysts such as Kenneth Arrow and A. Gerschenkron in the early 1960s, policy-makers increasingly argued that economic growth of the latecomers ought to be defined as an ‘adaptation’ of foreign technoscience (Arrow, 1962).10 In the developing world, it was suggested, science should be based on the ‘adaptive in-house’ research that would accommodate the local economy, programmes of development, academic infrastructure and public expectations. Emulating affluent neighbours was neither practical nor possible, especially when the correlation between wealth and productivity showed that ‘only wealthy countries can
support the luxury of prestigious “fundamental” research’ (Shrum & Shenhav, 1995: 631).

The adaptive in-house research, however, was neither a product of rational policy planning, nor a prerogative of the third world. The Yugoslavian scientific community, for example, evolved within a socialist ideology of scientific labour grafted onto a ‘bourgeois’ (German) university tradition. This syncretism was further shaped by the neutralist politics of the international non-aligned movement, which allowed Yugoslav scholars to exchange ideas on both sides of the Iron Curtain. A juxtaposition of these factors resulted in an organization of scientific work that went hand-in-hand with an emerging reliance on adaptive in-house policies.11

It is my argument that the creation of the Yugoslavian NWP models embodied these sets of concerns. The work on the Limited Area Primitive Equation model (LAPEM), followed by the Hydrometeorological Institute and Belgrade University model (HIBU), pulled together socialist programmes of education, the economic conditions of a socialist country, priorities in weather research, and the science policy shift towards applied research. This local assemblage, however, also reflected a professional ‘cross-pollination’ between the Yugoslav and international meteorological communities, as well as the fact that programmes in regional NWP evolved in other parts of the world. It would thus be difficult to argue that the character of Yugoslavian regional modelling was solely due to its local geopolitical origins, given the substantial presence of ‘foreign/international’ know-how. Relying on Star’s insights, I argue that these heterogeneous sources gave the Belgrade models that decisive level of theoretical plasticity and coherence that enabled their successful implementation in many of the world’s meteorological services: ‘The aggregation of all viewpoints is the source of the robustness in science’ (Star, 1988: 46).
Most of LAPEM, HIBU and the Eta model were created by Fedor Mesinger and Zavisa Janjic, lecturers and researchers at the Meteorological Institute of the University of Belgrade.12 Although their research and teaching during the 1970s were based in Belgrade, they spent extended periods of study abroad. During the 1960s, Mesinger visited the Department of Meteorology in Darmstadt and the National Centre for Atmospheric Research at Boulder, CO, and from 1968 to 1970 he stayed at the University of California, Los Angeles, working with Akio Arakawa and Yale Mintz. In 1978 he paid a visit to the Geophysical Fluid Dynamics Laboratory at Princeton. In 1984 he worked at the US National Meteorological Center (NMC), where he returned in 1988–89 and where, from 1991, he has been a Visiting Scientist within the programme sponsored by the University Corporation for Atmospheric Research.

Following his doctoral work, Zavisa Janjic spent 1975–76 as a Scientist at the European Centre for Medium-Range Weather Forecasts in Reading (Berks., UK), where he worked with Axel Wiin-Nielsen and Robert Sadourny. He spent three months at the University of Hamburg in 1979, and in 1985 briefly stayed at the US NMC, where he returned for extensive periods from 1987 to 1990 as the Principal Scientist of the
Scientific Program of the University Corporation for Atmospheric Research. Through this programme, he has been employed at the National Centers for Environmental Prediction (NCEP) since 1994; in 1997 he spent six months at the National Center for Atmospheric Research. In the interim periods, Mesinger and Janjic worked at the Belgrade Institute of Meteorology, where they also developed the early LAPEM and HIBU.

Belgrade in the 1960s and 1970s was not a major centre of meteorological science. Not a member of the Warsaw Pact or the North Atlantic Treaty Organization, Yugoslavia was a political island of what some described as socialism with a human face, and a leader (with India and Egypt) of the non-aligned movement of some of the countries comprising today’s ‘Third World’. After World War II, scientific and technological policy developed in line with two principles. The first reflected a belief that social revolution could not be achieved without dissolving the State – the principle of anti-etatism, which set the Yugoslavian system against that of the USSR and other eastern European countries – and a parallel socialization of production and political power (which set it against the rest of the world). The other principle – ‘decentralization’ – referred to the abolition of the functions of central administrative authority and their replacement by so-called self-managing or self-governing bodies. Both principles were to bring about the disappearance of the state apparatus and the expansion of the workers’ responsibility through self-managed enterprises. And they both shaped Yugoslav scientific research and organization (Review, 1976: 15–26).

An important feature of Yugoslav science organization before the 1970s was the foundation of ‘institutes’. These were at first run by the National Academy of Sciences and funded from the Federal budget,13 but during the process of decentralization they gradually turned into self-managed bodies funded by Republican money.
Some institutes functioned independently, others were connected to universities, and some operated as research and development units within businesses. Regardless of affiliation, the institutes reflected the 1960s belief that the economic growth of developing countries could not be conceived without nurturing ‘indigenous research’ and a domestic scientific infrastructure. In 1962, for example, Stevan Dedijer – in line with Arrow’s assessment – noted that every decision of national importance ‘requires not only know-how but also scientific knowledge produced by research performed in the local environment; . . . the importation of foreign specialists to produce [scientific results] is politically and economically intolerable as a long-term arrangement’ (Dedijer, 1963: 64).

A pro-indigenous bias could be detected in the institutes’ preference for theory over applied research. Authorities portrayed theory as cheap, compatible with teaching, and a first step towards application. This reinforced the ideological stance that Yugoslavian science policy:

can and must differ essentially from science policies in countries with capitalist or etatist social orders. In [Yugoslavian] society, research and its
results are placed in the service of all-around development and prosperity of society as a whole and of each person individually.14

In other words, securing the greatest possible freedom for researchers as individuals meant that science planning had to remain immune to central dictate. The time allotted to basic research was virtually unlimited. The employees of Belgrade’s Institute of Meteorology, for example, could work on pet projects without the onerous task of seeking outside grants, and engage in theoretical research without fear of penalty for doing what had no obvious relevance for economic development.

Yet squaring theory and development was not a walk in the park. Examiners from the Organization for Economic Cooperation and Development observed that if the Yugoslavian institutes existed to let people follow their interests, the stress on theory might be justified. But if the ultimate goal was economic growth, the situation was not favourable. By the mid-1960s, this problem became acute. The Resolution of Scientific Research, passed in 1965, asked the economic sector to provide a larger share of resources for research by encouraging long-term and development-oriented projects. The list of priorities included both work on the application of basic research and an inter-republic coordination of all but ‘autistic’ institutes (Razem, 1994).15

It was at this juncture, in 1971, that Mesinger thought about the shape of the Belgrade Institute’s research in NWP and the possibility of applying state-of-the-art research in a project of reasonable manageability in the Yugoslavian and regional context. He picked up the new science policy signals. In formulating a reply to the Federal Weather Bureau’s call (konkurs) for new projects, Mesinger and his colleagues played on the convincing issue of economic weather sensitivity. If some sectors of the economy, it was argued, depended on the cyclones of the western Mediterranean, more accurate forecasts would help avoid their consequences.
The resulting proposal asked for a new regional (‘mesoscale’ or ‘Limited Area’) model with instructional, experimental and operational purposes. The proposal intimated that construction of a new model should be a long-term enterprise that would both invigorate educational activities at the Institute and lead to operational application at the Weather Bureau.16 Optimistic in intentions, the Belgrade team was realistic in assessing the situation on the ground. Without a long history in ‘indigenous’ weather modelling,17 the project had to offset a series of drawbacks by coordinating interlocking factors of the scientists’ career plans, teaching and dissemination of NWP. The project leaders had to envision a product that would benefit the national economy, stimulate inter-republican collaboration and attract international attention. All this was still harder given the frustrating lack of computing capabilities and the looming concerns about the viability of local NWP.18 For example, what could be the selling point of a project to build an indigenous Yugoslavian model? Why not use the existing models? What could be gained by spending energy on a model that would run on slow computers and would most likely not meet the standards of already
tested models? And if such a model were to become a reality, how would it fare in the international community?

The answers lay in so-called regional (or mesoscale) weather prediction. Regional modelling started during the late 1960s as a spin-off of global (hemispheric) circulation models. Because the high complexity used in global modelling took a toll on computing machines – and because such models could not be run at high resolution – some meteorologists turned to more manageable regional modelling, obtaining higher quality forecasts using boundary conditions from global models. Reduction in the forecast area allowed for quicker integration and earlier dissemination of forecasts, while more numerous observations from smaller areas permitted the use of finer grids. Regional models were considered especially helpful in reproducing small-scale phenomena such as fronts and the influence of topography (orography). In one of the earliest models of this type, Bushby and Timpson simulated frontal rain: they claimed this was ‘one of the first attempts to predict weather, as distinct from pressure patterns and vertical velocity’.19 Several groups worked to reproduce weather conditions over small areas, and important research was done in the early 1970s at the Techniques Development Laboratory of the National Oceanic and Atmospheric Administration (NOAA) and the National Centre for Atmospheric Research.20

In Belgrade, Mesinger outlined an early code (about 400 lines) during the winter of 1973. Janjic made significant changes in 1977, resulting in a larger and roughly twice as fast code known as HIBU (about 1000 lines).21 The models were based on the so-called primitive equations of motion, which at the time included the physical processes in only a rudimentary form. Early test runs were carried out at the Yugoslav Federal Weather Bureau from 1975, and the model became operational on 1 January 1978.
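The economy of limited-area modelling can be illustrated with a back-of-envelope cost model. The sketch below is my own illustration, not drawn from the Belgrade codes: the domain sizes, grid spacings and the CFL-style time-step assumption are all hypothetical, chosen only to show why a small domain makes a finer grid affordable.

```python
# Illustrative, hypothetical cost model for grid-point NWP (not from the
# Belgrade codes). Compute cost scales with the number of horizontal grid
# points times the number of time steps; the stability (CFL) condition ties
# the time step to the grid spacing, so halving the spacing roughly
# multiplies the cost by 8 (2 x 2 more points, 2 x more steps).

def relative_cost(domain_km, spacing_km, forecast_hours):
    """Relative compute cost of a run: horizontal points x time steps.

    Assumes a square domain, a fixed number of vertical levels, and a
    time step proportional to the grid spacing (CFL-limited).
    """
    points_per_side = domain_km / spacing_km
    horizontal_points = points_per_side ** 2
    time_steps = forecast_hours / spacing_km  # dt proportional to spacing
    return horizontal_points * time_steps

# A hemispheric run at coarse resolution...
global_run = relative_cost(domain_km=20000, spacing_km=300, forecast_hours=24)
# ...versus a limited, Balkan-sized domain at three times finer resolution.
regional_run = relative_cost(domain_km=2000, spacing_km=100, forecast_hours=24)

print(regional_run < global_run)  # True: finer grid, yet a cheaper run
```

On these toy numbers the limited-area run is several times cheaper than the hemispheric one despite its threefold finer grid, which is exactly the trade that made mesoscale modelling attractive on modest machines.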
More substantial improvements took place during the 1980s, following which, in 1993, the Eta model – the descendant of LAPEM and HIBU – replaced the Nested Grid Model (NGM) at the US National Centers for Environmental Prediction (formerly the NMC).22 This rapid rise in the models’ prominence, as I intend to show here, was marked by an unusual complementarity between the promotional and theoretical activities of the Belgrade (and later Washington, DC) team: what the model did with the input data was regarded as important and connected to what it did in the institutional framework of national weather services and vis-à-vis other similar ventures in mesoscale NWP.

Building a Belgrade mesoscale model made sense in both local and trans-local contexts, as the earliest promotional activities show. Mesinger, for example, thought that the model could become a sub-programme of the Global Atmospheric Research Programme (GARP), a World Meteorological Organization (WMO) project spurred by the interest in improving forecasts.23 When in 1971 the Organizing Committee of GARP began to create research centres, Mesinger decided that the project should be an opportunity for Yugoslavian meteorology to carve its niche in the international arena. He
wrote to the Chairman of the National Committee for Geophysics in Sarajevo that:

many of the aims of GARP are of extraordinary importance for a small country like Yugoslavia, and the problems associated with the specific meteorological conditions of the country (that is, the great influence of topography) cannot be expected to be solved by other countries.24

Mesinger’s vision was that of research independence that would also have legitimacy and relevance within a broader web of NWP research activities. Thus, following the initial recommendation that the members of WMO submit sub-programme proposals within GARP, he proposed that the Belgrade Institute should lead a project on regional modelling in areas under the influence of the mechanical and thermal characteristics of the surface. The Belgrade team thought that establishing a connection between regional NWP and the simulation of topography was crucial: not only would such a connection make sense in the hilly Balkans and on the stormy Adriatic, but it would also give Yugoslavian scientists an opportunity to use local idiosyncrasies to build a leadership role in regional NWP.

Indeed, at the GARP meeting in Budapest in November 1974, it was pointed out that mountain effects could not be investigated without the use of a regional model. Conversely, the committee explained that the simulation of mountains required mesoscale modelling. In 1974, after the GARP proposal was approved, plans were under way to organize a large meeting at Montenegro’s resort St Stephan. The conference was a big success and its report became the basis for the Alpine Experiment.25

The project began to attract international attention. In June 1974 Paul E. Long from NOAA expressed an interest in visiting Belgrade and learning about the model firsthand.26 Fred Bushby from the Meteorological Office at Bracknell (Berks., UK) produced a comparative analysis of the Belgrade model and 17 other regional models.27 In 1975, Mesinger was elected to the Interim Committee for the European Centre for Medium-Range Weather Forecasts, which he could use as a platform to establish communication with NWP scholars and spread the word about Belgrade’s results. There was speculation that the project could receive up to US$800,000 from the WMO’s development fund.
Others suggested that Belgrade use National Science Foundation money for collaboration between the USA and Yugoslavia. Neither of these materialized, but Mesinger and Janjic continued to assemble a network of Balkan meteorologists, hoping to establish Belgrade as a centre for routine forecasts for the region.28 This happened only in the early 1980s, when Yugoslavia became one of the WMO’s nine ‘activity centres’ for NWP. This was announced in Geneva in January 1983, where the Commission for Atmospheric Sciences argued that regional models like HIBU ‘were of special interest since they should permit countries with only modest computer resources to make meaningful progress’.29 As a follow-up, the Federal Hydrometeorological Institute of Yugoslavia compiled a catalogue of the then existing regional models ‘for the benefit of developing countries with limited computer resources’.30 Such
activities make clear that Belgrade aspired to lead in mesoscale weather modelling that catered for the national services without cutting-edge technological infrastructure. What rationales justified such aspirations? And what other factors led to the implementation of the Eta model?

Building Robustness

Frugal Coding

One particularly attractive feature of the early model was its ability to give quality results on less powerful computers. In the early 1970s the University of Belgrade used an IBM 360/44 computer (a generation of October 1966), but, as Mesinger wrote in 1970, ‘it is too small for our needs’, and the model could be run only over an inappropriately small domain. Faced with this constraint, Mesinger and Janjic opted for an approach that I will call ‘numerical minimalism’: small computer capabilities could be offset by design ingenuity. In other words, if one could not expect to have the means to run large numerical codes with high resolution, one could hope to determine whether these codes could be made smaller without losses in realistic simulation. If the resolution of a new Belgrade model – the resolution refers to how finely atmospheric processes are represented on a numerical grid – could not match that used in major centres, ‘more successful design, in particular regarding aspects of special regional significance, could be expected to lead to better forecasts some of the time and hopefully when it really mattered’ (Mesinger, 2000: 88, emphasis added). According to Janjic, limited computing resources played a ‘large role’ in defining the conditions to be met by the model: optimization of the algorithms used in model dynamics, simplification of the physical package, and maximal efficacy of the code.31 In practical terms, this meant that the Belgrade team had to find ways of saving computer time by looking for alternative ways of numerically representing the hydrodynamic equations of motion. In 1976, for example, after running a version of the model on the Weather Bureau’s machine, Janjic found that it required 5 hours of computation for a 24-hour forecast.
The only way to shorten this time was to tinker with the code, not with the computer: his solution was to improve the expressions for the hydrostatic equation and the advection schemes, which cut the preparation time to 3 hours (Janjic, 1977). Such theoretically oriented strategic optimization differed from NWP practices in places with a solid computing infrastructure. Where there was enough computing power, one had little incentive to save, short-cut or optimize. For example, US meteorologists favoured finer model resolution, the addition of more variables, ad hoc adjustments and faster data crunching. These practices could address the problem of simulation complexity, but they also required larger memory and faster processors. Algorithmic elegance did not seem an urgent issue to pursue. Frederick Shuman, a senior meteorologist at the NMC, noted in 1978 that:
higher model resolution over larger areas requires more computations, but fortunately we have not fully exploited the IBM 360-195 system yet . . . the high speed of the new machinery, relative to the old, is required for advances in modelling, particularly for models with higher resolution.32
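Shuman’s trade-off between resolution and computation can be made concrete with a rough back-of-the-envelope sketch (my illustration, not drawn from the sources): in a model with two horizontal dimensions, halving the grid spacing quadruples the number of grid points, and the numerical stability (CFL) condition forces the time step to shrink in proportion as well, so doubling the resolution costs roughly a factor of eight in computation.

```python
def relative_cost(refinement):
    """Rough relative CPU cost of refining the horizontal grid spacing.

    `refinement` = old spacing / new spacing (2 means twice the
    resolution).  Grid points grow with refinement**2 (two horizontal
    dimensions); the CFL stability limit shrinks the allowed time step
    by the same refinement factor, adding one more power.
    """
    horizontal_points = refinement ** 2
    time_steps = refinement          # dt ~ dx under the CFL condition
    return horizontal_points * time_steps

# doubling resolution: roughly 8x the computation
print(relative_cost(2))  # -> 8
```

The point of the sketch is only that resolution is superlinearly expensive, which is why the Belgrade team looked for savings in the code rather than in the grid.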

The sophistication of early Belgrade models was soon recognized outside Yugoslavia. For example, during 1975 Mesinger corresponded with Ray Bates, a meteorologist at the Irish Meteorological Service in Dublin then on his way to take up a WMO post within the United Nations Development Programme in Cairo. Mesinger asked whether Bates would be interested in trying the model, describing it as ‘a limited area model, designed to run on a small computer IBM 375/135 with 250K [sic] memory’. Bates knew that the Egyptians had had problems with their model run on an IBM 370/145 with 112 kb, and that they hoped the Belgrade-made LAPEM could fit this smaller memory. When the model arrived in Cairo, Bates reported that he and the Egyptian meteorologists were ‘impressed with its elegance’. And when it was about to start running, Bates wrote that ‘[f]rom the longer term point of view, I was wondering if you would be willing to let us use your model in the Irish Meteorological Service’. Mesinger replied with great enthusiasm, and Bates took the model to the Republic of Ireland, where it soon became operational.33 In subsequent adjustments, Mesinger and Janjic further speeded up the forecasts and reduced the numerical errors associated with the inclusion of small-scale weather phenomena. The choice of numerical grids reduced the noise, Janjic’s new advection scheme prevented ‘the false energy cascade toward smaller scales’, and the ‘split-explicit’ time differencing scheme proved better than the one used in the then operational US NGM.34 Mesinger and Janjic also abandoned the idea of producing a computer-specific code, which they saw as a threat to the model’s portability and ease of transfer to other technological environments. With these numerical principles, they were eventually able to ‘squeeze’ a decent-size model into the small IBM computer of Belgrade’s Federal Weather Bureau.
But the real surprise came after a version of this model was run on NMC computers in the 1980s, where it turned out to be numerically more efficient than the competing US model by a factor of four!35 This means that the quality of the Eta model did not come – at least not exclusively – from the use of super-computers. Although the Eta model was eventually run on big machines, its estimated quality (‘skill’) derived from a meticulous optimization of its code. What originally had been a circumstance-specific imperative in the application of meteorological knowledge – coping with mountain weather and small computers – became a factor in determining the efficacy of the model’s performance. Mesinger has recently suggested that this unusual circumstance had to do with Belgrade’s relative independence from extramural funding and with the fact that: we still could do some computing at modest computers in Belgrade [and that] our coming from a place of modest resources, did help, in making it
possible for us to do well in Belgrade in the seventies, and in contributing significantly to our doing well in crucial comparisons with the [US models] in the late eighties.36

Such a perception gives a particular twist to the contention of modern historians of NWP who seem to insist that ‘[t]he effective use of the vastly increased capacity for observing the weather, the maturation of dynamical meteorology, and the great improvement in forecasting technique were all dependent on new calculating technology, principally the electronic computer’ (Nebeker, 1995: 4). But as Jon Agar has recently explained in relation to the crystallographic uses of the computer, ‘the computer is merely a fast calculation tool which “allows” rather than “shapes” scientific change’ (Agar, 1998).

Topography

The shifting emphasis of Yugoslav science policy in the 1970s towards applied research aiming to benefit society provided a convenient context for gaining support for a project in regional NWP. It was understood that new projects would result in products tailor-made to the local climate. The Belgrade group realized that the requirements of the new policy could be suitably met by a project involving an in-depth study of topographic effects on the atmosphere. This was because Yugoslavian weather was claimed to be ‘more determined by mountains than the weather of any other developed country’, owing to strong centres of low atmospheric pressure known as Genoa cyclones.37 The work on this problem began during the mid 1970s and progressed so rapidly that by 1976 Mesinger wrote to his colleagues in Zagreb and Boulder (CO, USA) that Janjic’s improvements placed Genoa cyclogenesis entirely below 500 mbar (50 kPa), which was considered a significant boost. Italian meteorologists took an immediate interest in this improved topographic simulation because, some of them wrote to Mesinger, it was ‘the only way to achieve new results on the lee cyclogenesis problem’.38 But Mesinger and Janjic found that their scheme could do better if it avoided problems associated with the model’s vertical coordinate, the so-called sigma coordinate.
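A brief technical gloss may help here (my summary of standard NWP textbook definitions, not drawn from Jankovic’s sources). Phillips’ sigma coordinate normalizes pressure by its local surface value, so coordinate surfaces follow the terrain; Mesinger’s eta coordinate rescales sigma by a factor computed from a reference atmosphere, so that coordinate surfaces remain quasi-horizontal and mountains appear as discrete steps:

```latex
% Phillips' terrain-following coordinate (generalized form;
% the 1957 original takes the model-top pressure p_T = 0):
\sigma = \frac{p - p_T}{p_S - p_T}

% Mesinger's eta coordinate: sigma rescaled by a reference-atmosphere
% factor evaluated at the terrain elevation z_S:
\eta = \sigma \, \eta_S ,
\qquad
\eta_S = \frac{p_{\mathrm{ref}}(z_S) - p_T}{p_{\mathrm{ref}}(0) - p_T}
```

where p is pressure, p_S the surface pressure, p_T the model-top pressure, and p_ref(z) a fixed reference pressure profile. Over flat terrain (eta_S = 1) the two coordinates coincide; over mountains eta_S < 1, which keeps the eta surfaces quasi-horizontal.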
The sigma coordinate was introduced by Norman Phillips in 1957 and used in the first regional model in the USA, launched in 1969 (Phillips, 1957).39 Mesinger and Janjic saw that the sigma coordinate generated drastically unrealistic temperature forecasts (Janjic, 1990: 1429). They also found that some of the sigma-caused errors were due not to calculation but to the nature of the code. In April 1976, Mesinger suggested an alternative. He wrote to a colleague: ‘in the scientific sense, I am in the mountains, where there are all kinds of problems, with the sigma system and others as well. I should perhaps try a new method, mountains made up of three-dimensional boxes of the grid’.40 By 1984, the box-idea had matured into the eta coordinate, in which surfaces remained quasi-horizontal and thus tended to reduce the errors inherent in the sigma system (Mesinger & Janjic, 1985). Thomas Black,
one of the scientists at the NMC’s Development Division, described the introduction of the eta coordinate as a ‘simple variation’ of the sigma, but one producing ‘a significant numerical benefit’.41 The benefit became apparent in a series of experiments in which the new Eta model could switch between ‘eta’ and ‘sigma’ regimes to allow a comparative analysis. These switches showed that the eta had lower numerical noise and a lower standard deviation in the geopotential height errors than the sigma. The results were widely appreciated among scientists familiar with the drawbacks of Phillips’ coordinate, and especially among those who felt that the sigma-driven NGM should be replaced by a new and more promising model.42

Design Philosophy

An instant implementation of a new model at the NMC, however, was never likely. Changing an operational system at the NMC was not a simple or inexpensive procedure: as a part of the National Weather Service and NOAA, the NMC provided forecast guidance to the Weather Service Forecast Offices, the Federal Aviation Administration, the Department of Defense, private meteorologists and the research community.43 Despite the magnitude of the changes required if a new model was to be introduced, there was a nagging perception that the NMC’s capabilities limped in comparison with those of the European centre for medium-range forecasting at Reading (UK). ‘Certainly’, recalled Ronald McPherson, ‘by the early 1980s NMC was no longer the world leader in numerical weather prediction’.44 Calls were made for a revitalization of the centre. When Mesinger arrived there in 1984, he observed the enthusiasm with which the institution anticipated such revitalization through the work of Norman Phillips, a doyen of US meteorology lured from the Massachusetts Institute of Technology to improve the NMC’s NGM.
The NGM had been running daily since June 1984, receiving operational status in March 1985.45 The Center’s first vector computer, a CYBER 205, had just been installed, and Phillips and James Hoke worked to customize the NGM code for the new machine. The NMC exuded enthusiasm and anticipation. Mesinger considered such a climate unfavourable for pushing his and Janjic’s model too hard and too fast, especially as Phillips steered away from Mesinger and Janjic’s approach. Mesinger briefly joined the work on the NGM, without missing opportunities to relay his hopes for the Eta model.46 A problem that plagued the NGM was that Phillips and his collaborators handled symptoms, not causes. The development of the NGM was turning into a series of numerical adjustments intended to tune the model to better approximate observations.47 The practice is common in weather modelling: it is an a posteriori technique of ‘adjusting coefficients in a model to improve the agreement between model results and measurements’. While sometimes inevitable, tuning is generally considered ‘bad empiricism’ by meteorologists, since it ‘artificially prevents a model from producing a
bad result’.48 In contrast, ‘good empiricism’ simulates atmospheric processes on the basis of appropriate ‘physics’ and actual observations. The ‘good’ approach requires that parameterization remain unchanged during the model’s testing, whereas in tuning, parameters can be changed interactively to make the model match all available observations (Petersen, 2000). These adjustments could go on ad infinitum without any need to consider the reasons responsible for the deviations. For example, the NGM included so-called horizontal ‘smoothing’ that corrected several parameters every 30 minutes. As the correction proved insufficient, it was further adjusted by ‘the half-hourly smoothing of fields of sigma surfaces . . . to reduce greatly the detrimental effects of cooling and moistening mountaintops and warming and drying of valleys that such smoothing causes’ (Hoke et al., 1989: 333). It is no secret that the majority of meteorologists believe that the accuracy of a model should increase with its faithfulness in the physical representation of the atmosphere. From such a realist position, tuning can go only so far before it turns into tinkering without physical rationale, threatening to tax the computer’s memory with an endless series of ad hoc corrections. Indeed, those who witnessed the beginnings of the operational changes at the NMC almost invariably suggest that these realist presuppositions played a crucial role in the NMC’s evaluation of the promise of a model. In the NMC’s view, the promise of a model was equivalent to its potential to assimilate principled rather than ad hoc corrections. If a new model could offer a more ‘physically plausible’ simulation, it would reduce the need for tuning.49 This reasoning had marked Mesinger’s and Janjic’s work since the early 1970s, and it was one of the vital components of their and others’ perception of the Eta model during the 1980s.
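The mechanics of such a posteriori tuning can be caricatured in a few lines of code. Everything in this sketch is invented for illustration — the linear ‘parameterization’, the coefficient range and the observations are mine, not the NGM’s — but it shows the pattern the realists objected to: a free coefficient is simply re-fitted to whatever observations are at hand, with no inquiry into why the model deviated.

```python
def tune_coefficient(model, grid, observations):
    """A posteriori tuning: pick the coefficient value that minimizes
    mean squared error against the available observations."""
    candidates = [c / 100 for c in range(1, 201)]        # 0.01 .. 2.00
    def mse(c):
        return sum((model(x, c) - obs) ** 2
                   for x, obs in zip(grid, observations)) / len(grid)
    return min(candidates, key=mse)

# A toy 'parameterization': response = coefficient * forcing.
model = lambda forcing, c: c * forcing

grid = [1.0, 2.0, 3.0]                  # invented forcing values
observations = [0.52, 1.01, 1.49]       # invented observations

best = tune_coefficient(model, grid, observations)
print(best)  # -> 0.5
```

Each time new observations arrive, the fit can simply be redone — the ‘ad infinitum’ the text describes — whereas the ‘good empiricism’ alternative would fix the coefficient by physical argument and leave it unchanged during testing.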
Janjic’s substantial and original work on the new ‘physics’ only added to this perception. The freedom to theorize that complemented the idea of ‘indigenous’ research translated into a meticulous and open-ended study of the fundamentals of atmospheric physics and numerical simulation. In Mesinger’s recollection, his work and that of his colleagues had always put the emphasis on why things work rather than on how they work. The Yugoslavian principle of self-management could only encourage this attitude by stimulating decision-making among the Institute’s staff. Self-management entailed a lack of centralization, and this ensured more freedom in the critical assessment of the dominant approaches, since science divested from state control ‘would never reach a financial “point of no return” ’ (Fuller, 1997: 489). In practice this allowed Janjic and Mesinger to venture into unexplored territory, play with alternatives and deliberate over outcomes, without pressure to justify choices or meet deadlines. These opportunities, Mesinger noted, stand in a stark contrast to the style I’d say is typical of universities in the [United] States as I have gotten to know it, where the emphasis was on efficiency, on a lot of homework, on doing things, on doing a lot of them,
and fast. Thus, we would question things which others did not, and we have taken our time to do it.50

This was achieved by a model that served not only as a prediction tool, but also as a vehicle for testing numerical schemes or investigating regional weather through experimental simulations. These could then be used for teaching, in which a specific phenomenon could be illustrated by ‘playing’ with the code elements. For example, unlike their colleagues elsewhere, Mesinger and Janjic treated the smallest resolvable scales of meteorological phenomena in terms of chosen physical principles, reducing tuning to a minimum, a strategy believed to have led to the Eta’s superior precipitation forecasts.51 Janjic’s work on the ‘physics’ of the model added to its overall performance, as did Black’s contributions in the domain of radiation and the modification of data archives, graphics and reformatting. Janjic’s important work on the numerical expression of horizontal advection, conservation of energy and the hydrostatic equation gave the model a degree of physical plausibility that challenged the NGM and helped the Eta’s operational use at the NMC. The indigenously theoretical approach of the Belgrade group could thus be said to have inadvertently led to solving the practicalities of Washington’s predicament. It should be remembered that institutional factors played a significant role in the creation and implementation of the Eta model. During the years of crisis, the NMC authorities considered alternatives to the standard modelling research and organization. One of these alternatives was to develop a ‘visitor friendly’ infrastructure, which permitted visiting scientists to bring new ideas and become productive in the short term. Within this scheme – the so-called Visiting Scientist Program – the NMC developed a ‘test facility’ for quasi-operational testing of models produced outside the USA and launched a ‘modular approach’ to allow visiting researchers to work on one common model while staying within their own specialities. The program helped 31 scientists between 1984 and 1991, including Janjic and Mesinger.
It also benefited the NMC. Ronald McPherson thought that it was the foremost catalyst in the NMC’s effort to upgrade its products, as it provided the institution with ‘an influx of new blood and new ideas’. He thought that Mesinger and Janjic’s work was particularly important as it led to the NMC’s most successful operational forecast model (the Eta): ‘The Visiting Scientist Programs are essentially responsible for that model. In my judgment, they were perhaps the single most important factor in what I’d call the revitalization of the NMC in the 1980s. The contributions have really been enormous’.52 In addition, the recognition of the Eta’s quality happened at a propitious moment. During the 1990s, mesoscale (regional) forecasting became a top priority in the US Government’s bid to boost forecasting accuracy through its Weather Research Program (USWRP). The USWRP was to ‘reduce the human and economic toll of weather-related natural disasters’ by 20–30% through improved understanding and simulation of atmospheric processes. The 1994 Congressional Submission of the USWRP
Subcommittee on Atmospheric Research called for a ‘major, coordinated national mesoscale research project to study mechanisms involved in the formation of precipitation in both summer and winter storms in various regions of the Nation’ (US Weather Research Program, 1994). The USWRP entailed a large budget, rising from US$30.4 million in 1994 to US$51.2 million in 1998. Favourably for the Eta model, the 1994 document had a list of ‘science priorities’ that stressed the need to put more energy into understanding mesoscale phenomena, pursue regional and local scale numerical modelling, and determine the limits of atmospheric predictability. A particular emphasis was given to improving precipitation forecasts, one domain in which the Eta model has performed brilliantly.53 Ten years after its implementation in 1993, the Eta model is used for daily weather forecasts in Belgium, Brazil, Egypt, Finland, Greece, Iceland, Malta, Peru, South Africa, Tunisia, the USA and Yugoslavia.54 Since then, it has been shown that the Eta model can usually produce reliable precipitation forecasts and accurately capture small-scale circulations, such as cold fronts and orographic rainfall (Black, 1994). When asked, meteorologists tend to agree that the model has a subtle numerical design and a strong physical package, and lends itself to physically justifiable alterations. For the NGM to meet these standards, Hoke recounted in an email to Mesinger, an expensive differencing scheme and the hemispheric boundary conditions would have to be reworked, a job that would require redesigning the model from square one.55

Science Migrations

The travel of regional NWP research from Belgrade in the 1970s to Washington, DC, and the rest of the world in the 1990s embodied the theoretical, promotional and political decisions made by Yugoslavian and US meteorologists over the 30 years of the Eta model family. I have tried to show the ways in which the Yugoslavian and US phases of research complemented each other. As the research undertaken in the USA was grafted onto the principles laid down in Belgrade, it is clear that the model represents an ensemble of elements defined by the intellectual, institutional and technological conditions in Yugoslavia as well as the USA. While the computing stipulations and research opportunities in Belgrade shaped the approach to coding and physical representation, the model gained in skill and wider recognition only in the infrastructural haven of a US Government institution. In other words, the slow importation of technology from the centre to the ‘periphery’ was in this case reversed by a rapid export of knowledge from the ‘periphery’ to the ‘centre’.56 The Eta group’s decisions to optimize small codes, simulate topography and experiment with untried approaches derived from the exigencies of the Balkan climate, Yugoslavian technology, the academic division of labour and socialist research policy. It looks as if the collage of these stipulations and strategies exemplified the official view in which ‘science is society’s reply to the challenges of specific historical development’ (Science Policy,
1973: 16). While the original proposal for the LAPEM model in the early 1970s acknowledged technological disadvantages, the lack of a research tradition and Belgrade’s dependence on larger centres for weather information, subsequent decisions showed that the modellers could, paradoxically, take these drawbacks as postulates and exploit them to their best advantage. The outcome was a product that was theoretically, as well as politically, regional, conforming to socialist ideology and competing on the global NWP market. Following installation in Belgrade, the model travelled outside Yugoslavia. It first became a commodity on a market of less-computerized weather services such as those in Egypt and the Republic of Ireland. During the 1980s, however, the NMC’s Visiting Scientist Program moved the Eta-model team to the USA, where the local regimes of research on the NGM and a subsequent increase in the support for mesoscale modelling led to the model’s implementation in the NCEP. Let us look at two issues exemplified in these events: the local/national creation of technoscientific products and the global/transnational travels of such products. The career of the Eta model shows that scientists in less-developed countries could and sometimes did customize their science to meet local contingencies. Mesinger’s and Janjic’s education and specialization in cutting-edge NWP research did not end in imitation and dependency. Rather, they worked to domesticate their ‘foreign-acquired’ expertise into the in-house adaptive approach advocated by Yugoslavian policy analysts of the late 1960s. The important consequence was that the early model produced local forecasts at the Belgrade Weather Bureau before it could boast a wider reputation in the global NWP community. Does this suggest that its assessment depended on standards local to the Yugoslavian scientific community?
Could it be that meteorology in Belgrade presents a case of a self-sufficient and self-governing enterprise with quality results produced without the imprimatur of the centres of calculation that are said to legitimise all locally produced scientific information? If this is the case, then the account presented here may contradict some recent analyses which state that in ‘the local infrastructure of technoscience, most people and things are tied directly into the international science system . . . . Without this connection a scientific locality cannot be taken seriously, no matter the perfection of its assemblage or the quality of work being done’ (Chambers & Gillespie, 2000: 231–32, my italics). There is no contradiction here: the Yugoslav NWP cannot claim the status of an ‘indigenous scientific knowledge’ in the sense of indigenous healing methods or aboriginal navigation. The obvious reason is that even if we assume that the criteria of accreditation of knowledge operate locally, we cannot assume that they must be local. Modern NWP involves a highly esoteric and robust fund of theoretical, empirical and numerical meteorology that renders local research goals practically identical across the globe: to construct a deterministic, numerically calculable code that would make a reliable prediction of future weather in a given locality. This fund – this representational/cognitive dimension of meteorology – is a part of the ‘international system of science’, a system, in the words of Chambers and
Gillespie, that ‘formulates priorities for research funding, privileges certain modes of inquiry, sets standards for the size of things, authorizes knowledge claims’ and enables mechanisms of social control (Chambers & Gillespie, 2000: 231). What is crucial to ask in this context, however, is whether we can say more about the system’s origins. How does it come into being, and can all local science be said to kowtow to its dictates? The ‘systemic’ model seems to put too much stress on the downward disciplining of the local and too little on the upward construction of the global. To paraphrase Goethe, could it be the case that the system may force, but rarely compel? This is the moral of the Eta story. All origins are local by definition, and our understanding of the dialectic between local embodiments and macroscience should take into account the nature of the former by looking into the kinds of activities that make different communities develop different takes on shared cognitive and methodological goals. It is the execution of these goals that creates local differences in results. For example, some meteorologists run their own models on the latest generation of computers; some import forecasts from elsewhere; others adopt a program and work it out locally; and yet others make models with their own resources in mind. It is such idiosyncrasies in research – i.e. the various ‘vectors of assemblage’ – that characterize the ‘local’ or ‘national’ tradition, or even the local ‘office’ tradition within the same (US) national weather service (Fine, 2002). The only factors that can add up to anything like ‘localness’ in techno-scientific sites are those that determine: (1) what a product does for an in situ community; and (2) what stipulations configure the execution of the goals circulated within the ‘science system’.
The product’s locality implies neither incommensurability nor incommunicability, but rather differences in understanding of whether the product has achieved its goals in a native environment.57 My intention in this conclusion is to use the notion of performance in a geographical context of knowledge production. The localness of technoscientific assemblages, I argue, derives from the realm of their performance. More precisely, it lies in those representational/theoretical outcomes that result from the negotiations between research groups and local conditions, which can thwart, encourage or abort the execution of shared objectives (of, say, numerical forecasting of the weather). But the localness also lies in the performance of these outcomes in, for example, the quality that an NWP product accomplishes in a given environment. It is in this sense that I have been using the term ‘product’ throughout the present paper, because it points to the micro/macro dialectics in a geographical distribution of knowledge. Its technological implication indicates a move from the conditions of local knowledge production to the conditions of possibility for knowledge ‘travel’, without implying that these are two distinct activities. While ‘product’ implies a terminus, achievement and stability, it also suggests a purpose and a set of conditions necessary for its use. A product is both the result of an action and a prerequisite for a future action:
like cars and scissors, NWP models are designed to perform a specific task. Emphasizing performance/work, however, does not suggest that, to return to our case study, the Eta model outperformed other NWP products like the NGM in any absolute way. Rather, the researchers involved in its development measured its performance against a set of theoretical, institutional, national, technological and economic contingencies of the 1970s and 1980s.58 The Eta model ‘travelled’ because it connected its local origins with its local destinations. In moving from Belgrade to Slovenia, Egypt, the Republic of Ireland, Italy, the USA, Malta and Brazil, the Eta family maintained its representational identity (as coded mathematical simulations) while plugging its performative power into new knowledge spaces. In this cognitive diplomacy, the features of the model that were ‘Made in Yugoslavia’ were solving problems as if they were ‘Made Someplace-Else’. The Eta model thus moved because it created a brotherhood of localities with similar conditions of scientific and social life, which the local authorities believed could be tackled by adopting the model. To illustrate: the frugal coding was a response (and solution) to both Yugoslavian and Egyptian computing predicaments; modelling topography answered both Yugoslavian and Italian interests in ‘Genoa cyclogenesis’. The preference for what some perceived as physically sound ‘simulation’ over ad hoc tuning suited both the socialist culture of basic research and the attempts to resuscitate the NMC. In other words, different agencies shared the model because they shared the concerns that informed its construction. Each of these concerns could then anchor itself in a different habitus of research, making the Eta ‘work’ by enabling its theoretical ‘constant’ to handle the practical ‘variable’.
This picture accords with Susan Leigh Star’s distinction between theoretical coherence and plasticity: ‘any scientific theory can be described in two ways, the set of actions that meet local contingencies [plasticity] . . . or the set of actions that preserves continuity in spite of local contingencies [coherence]’. Theories and other entities that preserve both of these aspects are ‘boundary objects’: ‘plastic enough to adapt to local needs and constraints of the several parties employing them, yet robust enough to maintain a common identity across sites’ (Star, 1989: 21). As a boundary object, the Eta model showed a degree of resilience necessary to appeal to several research sites, and a sufficient degree of internal consistency to enable its incremental development into a quality NWP product. Belgrade’s approach was plastic enough to accommodate not only Yugoslavian meteorological needs but, as the WMO stressed during the 1980s, those of the poorer or less-computerized nations as well. And because the ease of use and compatibility played a vital role in the Eta’s implementation, Janjic and Mesinger worked to keep these elements intact by cementing them in the theoretical structure of the code. For example, the Eta’s choice of integration domain and the coding approach reflected Belgrade’s emphasis on portability and plug-in compatibility. This meant that the authors could run the model on any geographical region of interest

Jankovic: Science migrations

65

by specifying the region’s longitude and latitude. To ensure portability and adjustment to next-generation computers, the authors also avoided the use of computer-specific Cyber 205 Fortran extensions.59 This shows that the theoretical and numerical choices that shaped the model’s construction always followed from plans for its potential performance in an anticipated knowledge space. Modern NWP research – and possibly much of the research in the earth sciences – suggests that the global and the local interleave as local assemblages and global directives mutually construct each other. In this co-construction, the ‘international science system’ offers a pool of shared cognitive ideals transformed by the work on site. The difference between the perceptions of international ‘frontier’ techno-science and the local ‘backwater’ adaptation is that the former implies a representational notion in which what matters is a black-boxed ‘fund’ of rules and knowledges, while the latter implies a performative agenda in which what matters is exactly that – what locally matters. The outcomes of local research may (or may not) be plugged in to the international circulation. Such a flow gives space for ‘indigenous’ products – in the limited sense described earlier – but it can also suggest solutions to non-local problems.60 The net effect of this circulation, in the words of Andrew Jamison, is that ‘national cultures affect the development of knowledge in decisive ways; and what we call an international science is really a kind of global ecosystem with intricate patterns of knowledge exchange and energy flows’.61 If this recognition dissolves the distinction between national (and peripheral) and international (metropolitan) science, it also challenges the hierarchical order implied in the division between science as a global and as a local phenomenon.
The synergetic flows of knowledge and practices that the Eta model has exemplified show that every science product begins as an appropriation and ends (or may end) as an export. What happens in between is a negotiation between ideals and contingencies. This image of bartering saves us the trouble of pondering whether science in developing countries remains sycophantic to the expertise of the rich or answerable to local needs. Policy-makers used to assume that harnessing western techno-science brings benefits to developing nations, but this does not mean that the opposite flow is less frequent, or impossible. The NMC’s Visiting Scientist Program – like many US universities and the big corporations’ research departments – effectively harnesses non-US products to the benefit of US meteorology. Yet as the Eta case has shown, this view relies on a problematic dichotomy between the central and the peripheral: ‘globalization’ and ‘localization’ are best seen as two sides of the synergetic flow of products that gain cognitive credentials through their performance, adaptation and circulation.

Notes

This paper would not have been possible without the generous support of many individuals. My foremost gratitude goes to Fedor Mesinger and Zavisa Janjic, who have always provided crucial and otherwise inaccessible information about their research. Fedor Mesinger has


Social Studies of Science 34/1

kindly allowed me to use his correspondence and has been most helpful in unearthing Eta-related documents and manuscript materials. I was fortunate to work with Karen Michels, librarian at the NCEP Reading Room: she not only helped me find my way around the collection, but was also kind enough to do her own search and identify items which she thought (rightly) would be relevant, all before my arrival! Special thanks go to Michael Lynch, Jon Agar, Alexis de Greiff, and the four anonymous reviewers for close reading and vital suggestions. I am grateful to Eugenia Kalnay and Thomas Black, who have been forthcoming in sharing their memories of the 1980s NMC. For their comments on earlier versions I also wish to thank Jim Fleming and other participants of the first meeting of the International Commission for the History of Meteorology in Mexico City, as well as the members of the Tyndall Centre for Climate Change Research and Roger Cooter of the Wellcome Unit at the University of East Anglia. Joan Mottram, Lori Hughes, and Bill Luckin have been most helpful with their editorial comments and corrections.

1. Frederik Nebeker (1995: 4) writes: ‘[t]he effective use of the vastly increased capacity for observing the weather, the maturation of dynamical meteorology, and the great improvement in forecasting technique were all dependent on new calculating technology, principally the electronic computer’.
2. Charney et al. (1950). For the early development of NWP see Arakawa (2000) and Cressman (1996).
3. The present paper does not analyse the intellectual weight of individual contributions to the development of the Eta. Over 30 years, the individuals involved in the project made numerous contributions that differed in profile, relevance and magnitude. My argument, however, does not require these to be treated as anything but elements of a collective effort, and any attempt to read it otherwise would be misdirected.
4. For review see Livingstone (1995); see also Shapin (1998), Ophir & Shapin (1991) and Chambers & Gillespie (2000).
5. Andrew Jamison (1993: 188) writes that Swedish and Danish science communities reveal ‘metaphysical biases . . . influenced by the role(s) that science had played in their countries’ historical development, and, more generally, by their respective countries’ geographical conditions and socio-economic priorities’.
6. Related issues touch on the effects of scientific migration on ‘convergence, synthesis, and innovation’; see Crawford et al. (1993).
7. Harper shows that the project lacked the experts in weather analysis who would provide a ‘physical’ basis on which to mount the mathematical expertise of US scientists. This was changed by the ‘imported’ Scandinavian ‘tag-team’ who had ‘a solid feel for the atmosphere’, and whose move to the USA resulted in a cross-pollination of research traditions that enhanced the project’s chances of success.
8. See also Turnbull (1993–94) and Turnbull (1995).
9. See also Shinn et al. (1997) and Sagasti (1978–79).
10. Arrow argued that the results of the investigations on the experiential basis of learning could demonstrate, on a macro level, the necessity for the establishment of public funds to ensure national ‘customization’ of knowledge that would reflect local priorities. For the evolution of science and technology policy at this time, see Salomon et al. (1994).
11. On the non-aligned movement, see Rubinstein (1970) and Willetts (1978).
12. It should at once be acknowledged that the work on the model has been a matter of collective effort by individuals, not all of whom can be mentioned here. On the Yugoslavian side, there were Dusanka Zupanski and Slobodan Nickovic, and on the ‘foreign’ (NCEP) side, Thomas Black, Lech Lobocki, Qingyun Zhao, Fei Chen, Kenneth Mitchell and Brad Ferrier.
13. The Socialist Federative Republic of Yugoslavia was made up of six Republics: Slovenia, Croatia, Bosnia and Herzegovina, Serbia, Montenegro, and Macedonia. For the establishment of similar institutes in India, see Goonatilake (1984: 99).
14. Science Policy (1973): 110. Other publications include Dezeljin (1966, 1985), Maksimovic (1971) and Dedijer (1991).


15. Yugoslavian politics and its relation to science policy have not been extensively explored. Anecdotal evidence suggests that the dissolution of Federal control signalled the Federal Science Fund’s insolvency. Professor Milorad Ristic, retired from the Faculty of Mechanical Engineering at the University of Belgrade, recalled that the idea of business funding lacked a ‘temporal perspective’, because Yugoslavia could not afford to rely on such an approach in long-term projects. The opinions I sampled on the relationship between science and socialist self-management suggest a generally ‘positive’ assessment. The idea of self-choice and self-management seemed to chime with intuitions about intellectual ‘autonomy’, although in some (perhaps many) cases, one’s research freedom was bought at the price of political conformity or keeping a low public profile. On this subject, more information is becoming available through the transcripts of ‘Fonoteka’, the audio archive of the Museum of Science and Technology, Belgrade.
16. ‘The Development of the Numerical [Weather] Prediction on Smaller Scales Focused on a More Detailed Prediction in the Lower Boundary Layer. The Application of the Model for Objective Forecast of the Surface Parameters in the West Balkans’, MS signed 15 May 1971. This document cites Mesinger as principal investigator, with collaborators (in order of appearance in the text): Petar Gburcik, Djuro Radinovic, Verica Gburcik, Zavisa Janjic and Nedeljka Mesinger. The model is described in Mesinger & Janjic (1978).
17. Before Mesinger’s and Janjic’s project, the work on NWP at the Institute was in the hands of Petar Gburcik. His quasi-geostrophic model was running experimentally at the Weather Bureau, but it did not use the so-called primitive equations of motion, at the time widely recognized as the future of NWP.
18. When Janjic optimized the original code he took it to Radovljica, Republic of Slovenia, to run on a local IBM 360/40 (generation of May 1963) which, although slower, had a larger memory (Mesinger to Dusan Djuric, 9/1/73). In Belgrade, the Weather Bureau operated a CDC 3600 (generation of 1963) and an IBM 360/44, but often without magnetic tapes available, and at a cost of 30p (50 cents) per hour of calculation. By comparison, £100 was a full professor’s monthly salary as well as the cost of 1 m2 of apartment space in downtown Belgrade (Mesinger to Dusan Djuric, 11/8/70). Ironically, when in 1973 the Bureau bought an IBM 370/135 (generation of 1971) with considerable memory (192 kb), the machine proved somewhat inadequate for weather prediction; it was also slower than the older IBM 360/44. In contrast, during his 1975–76 stay at the European Centre for Medium-Range Weather Forecasts (Reading, Berks., UK), Janjic used the powerful IBM 360/195 (2048 kb memory, generation of March 1971). Product information gathered from <http://fms.komkon.org/comp/misc/List.txt> (accessed 13 March 2002).
19. Bushby (1986), quoted in Mesinger (2000). For technical detail, see Kalnay (2002) and Pielke (2002).
20. The first day-to-day operational use in 1972 was at the UK’s Meteorological Office. At the US NMC, the first operational limited-area model (the so-called Limited-Area Fine Mesh Model) came to life in February 1971 (Monmonier, 1999: 97). See also Randerson (1976), Keyser & Uccellini (1987), Anthes (1983) and Pielke (1984).
21. A unique feature of the model was its space–time differencing scheme for the divergence term of the continuity equation and the pressure gradient term of the equation of motion. The later version featured a new scheme for horizontal advection, conservation of energy, and an improvement on Arakawa’s hydrostatic equation (Mesinger & Janjic, 1974; Janjic, 1977).
22. Some of these included, in an approximate order of appearance: the introduction of non-linear advection schemes, the new eta vertical coordinate (leading to the change of name to Eta model), the customization of the code for a vector computer at the NMC and the launch of an advanced representation of surface variables, rain, wet convection and turbulence (Janjic, 1984, 1990, 1994).


23. Nebeker (1995: 174); see also Namias (1968).
24. Mesinger to Abdulah Numinagic, 28 January 1971. On the origins of GARP, see Robinson (1967) and Reed (1971).
25. Mesinger to Dusan Djuric, 21 April 1976. Mesinger and Janjic sought collaboration, for example, from the Croat turbulence specialist Makjanic, the Greek scientists Homer Mantis and Demetrios Lalas, and the Bulgarian meteorologist Stojchov. Connections were established with Turkish workers too.
26. Paul E. Long (NOAA, MD, USA) to Mesinger, 28 June 1974. Joost A. Businger from the University of Washington gave a talk at the Institute in September on the surface layer airflow.
27. Frederick Bushby to Mesinger, 1 March 1976.
28. Mesinger to Lennart Bengtsson (Stockholm), 19 November 1973; Mesinger to Homer Mantis (Athens, Greece), 11 November 1972; Demetrios Lalas (Athens, Greece) to Mesinger, 1 October 1976.
29. WMO Bulletin 32 (1983): 240.
30. WMO Bulletin 34 (1985): 59. See also a report on the meeting of the Climate Analysis Section Working Group on Short- and Medium-Range Weather Prediction Research in Belgrade, 26–30 August 1985, R.J. Bates presiding, WMO Bulletin 35 (1986): 59–60.
31. Janjic to author, email communication, March 2002.
32. Shuman (1978: 15). It was not obvious that an increased resolution would help. In 1985, Mesinger and Janjic published results showing how high resolution can generate errors in the case of at least two pressure gradient force schemes. ‘[I]ncreasing the vertical resolution had no effect on the error in one case, and was leading to an increase of the error in another’ (Mesinger, 2000: 96), referring to results in Mesinger (1982) and Mesinger & Janjic (1985).
33. Bates to Mesinger, 7 March 1976; Bates to Mesinger, 17 August 1976; Mesinger to Bates, 23 September 1976 (Bates & McDonald, 1982).
34. Mesinger & Arakawa (1976), Mesinger (1973) and Janjic (1979).
Janjic produced a series of original solutions in a comprehensive ‘physics package’ added to the previously ‘minimum physics’ Eta code.
35. Mesinger reminisced that the model’s design ‘compensated for an enormous NGM effort of extracting maximum advantage of the so-called vector-processing and special code features of the NMC’s CYBER 205 computer in the mid-eighties, so that, in formal efficiency, time per number of grid points, we were not behind’. Mesinger to author, personal communication, June 2001.
36. Mesinger to author, email communication, May 2001.
37. Mesinger et al., ‘Research Proposal’, 15 May 1971. On Genoa cyclones and the assessment of material damages, see <http://www.cpc.ncep.noaa.gov/products/assessments/assess_99/europe.html> (accessed 29 November 2003).
38. Stefano Tebaldo to Janjic, 5 January 1977.
39. See also Edwards (2000) and Howcroft (1971).
40. Mesinger to Dusan Djuric, 21 April 1976.
41. In technical terms, the change was about the normalization of the vertical coordinate: whereas the sigma was normalized with respect to surface pressure, the eta was normalized with respect to the mean sea-level pressure. This meant that while sigma surfaces closely followed topography, in an eta-driven model the surfaces were slopeless, a feature that helped in ‘computing the pressure gradient force near steeply sloping terrain mountains’ (Black et al., 1988, 1993). For more detail, see Mesinger & Black (1992).
42. Black & Janjic (1988); Janjic et al. (1988: 147). See also Kalnay (1992).
43. For the history and structure of the NMC, see NMC Handbook (1990). On the US Weather Service, see Hughes (1970), Whitnah (1961) and Shea (1987).
44. Ronald McPherson, quoted in Report (1994: 12). Kalnay remembered the NGM’s ‘not very sophisticated’ numerical schemes, the NMC’s ‘isolation’ from the outside world, and even incompetence in the Development Division. Kalnay to author, personal communication, April 2002. The kind of problems the NMC encountered with the Limited-Area Fine Mesh Model parameterization of physical processes is mentioned in Xiao-Rong & Hoke (1985). See also Shuman (1977) and Stackpole (1978).
45. Phillips was not using Arakawa’s grids, and when Mesinger, out of curiosity, ‘translated’ Phillips’s grid into Arakawa’s classification, it turned out to be the D grid: the poorest in the family! Mesinger to author, personal communication, June 2001.
46. On NGM research see Junker et al. (1989), in which the authors compare the NGM’s performance in forecasting surface lows and highs, precipitation events and diurnal cycles with that of the Limited-Area Fine Mesh model and find the NGM to be more accurate. See also Petersen et al. (1991).
47. One of the technical reasons to freeze the NGM was based on the realization that its improvements were constrained by the ‘input for the [rigid] Model Output Statistics guidance’. This meant that any alteration in the NGM’s performance would be harmful to the Model Output Statistics (Kalnay et al., 1992: 4).
48. Here I do not consider the epistemological issues discussed in Norton & Suppe (2001) and Oreskes (2000).
49. Randall & Wielicki (1997: 403, 404); see also Jianjian et al. (1998).
50. On the different means to evaluate a model (theoretical underpinnings, pools of data, conceptual constructions), see Oreskes (2000: 37). See also Edwards (2001: 56–7).
51. Mesinger to author, email communication, June 2001.
52. An analogy can be made between Belgrade’s research climate and that surrounding the work of contemporary amateur astronomers, who, without the pressure to apply for grants or publish, produce high-quality results and often collaborate with professional astronomers (Dyson, 2002).
53. Kalnay to author, personal communication, April 2002; Mesinger et al. (1992): 96.
54. McPherson in Report (1994: 12). See also Kalnay et al. (1992).
55. US Weather Research Program (1994: 65). Among 14 itemized priorities, ranking third was one on improving ‘decision aided forecasts’ and data assimilation, which was planned to receive US$4.3 million (US$1.3 million more than in 1993).
56. A selected list of major contributions to the operational Eta model includes Janjic (1984), Mesinger & Janjic (1985), Black (1988), Mesinger et al. (1988), Mesinger & Black (1992) and Treadon (1993). For changes in the NGM’s handling of orography and the deficiencies in the Eta model, see Petersen (1992: 13–17).
57. One can see how such a reversal leads to a brain-drain in the less advantaged countries. In this respect, the Yugoslavian situation has been exacerbated by the recent civil unrest and wars, the consequences of which are only slowly being recognized and measured in the academic context.
58. This view in many ways accords with Andrew Pickering’s insistence on the ‘performative’ aspect of technoscience: ‘science can be seen as a realm of instruments, devices, machines, and substances that act, perform and do things in the material world’ (Pickering, 1999: 374). This is not to imply that the Eta would have attracted wider attention without the circulation of its ‘immutable mobiles’, or without the forging of the ‘trust’ that underlies ‘expertise’ and the institutions that produce and vouch for expertise. In fact, scientists often admit that they are arch-managers in that they cannot afford to ignore the professional, institutional and political exigencies that influence their judgement (Latour, 1990: 45; Shapin, 1998: 8). For a fine case study of socialist technological artefacts, see Stokes (1997).
59. See Black et al. (1993: 69).
60. The situation is similar to that in post-war Sweden, where ‘[h]igh energy physicists constitute a republic of science where the political unit of the nation is less suitable as a category for historical analysis’ (Widmalm, 1993: 109).
61. See Jamison (1993: 204).


References

Agar, Jon (1998) ‘Digital Science: What Difference Did Computers Make?’, unpublished paper presented at the BSHS New Directions of British Computing conference.
Anthes, R.A. (1983) ‘Regional Models of the Atmosphere in Middle Latitudes’, Monthly Weather Review 111: 1306–35.
Arakawa, Akio (2000) ‘A Personal Perspective on Early Years of General Circulation Modeling at UCLA’, in David Randall (ed.), General Circulation Model Development (New York: Academic Press): 1–65.
Arrow, Kenneth J. (1962) ‘The Economic Implications of Learning by Doing’, Review of Economic Studies 29: 155–73.
Bates, J.R. & A. McDonald (1982) ‘Multiply-Upstream, Semi-Lagrangian Advective Schemes: Analysis and Application to a Multi Level Primitive Equation Model’, Monthly Weather Review 110: 1831–42.
Black, Thomas L. (1988) The Step-Mountain Eta Coordinate Regional Model: A Documentation (Silver Spring, MD: NOAA/NWS).
Black, Thomas L. (1994) ‘The New NMC Mesoscale Eta Model: Description and Forecast Examples’, Weather and Forecasting 9(2): 265–84.
Black, Thomas L. & Z.I. Janjic (1988) ‘Preliminary Forecast Results from a Step-Mountain Eta Coordinate Regional Model’, presented at the Eighth Conference on NWP (Baltimore, MD: American Meteorological Society).
Black, Thomas L., Z.I. Janjic & F. Mesinger (1988) ‘The New NMC Model on Eta Coordinates’, Research Highlights of the NMC Development Division: 1987–1988 (US Department of Commerce, NOAA): 68–71.
Black, Thomas L., Dennis Deaven & Geoffrey DiMego (1993) ‘The Step-Mountain Eta Coordinate Model: 80km “Early” Version and Objective Verifications’, Technical Procedures Bulletin, no. 412, 21 May 1993 (US Department of Commerce): 1–7.
Bushby, F.H. (1986) ‘A History of Numerical Weather Prediction’, Journal of the Meteorological Society of Japan (special volume on Short and Medium-Range NWP, Collection of Papers Presented at the WMO/IUGG NWP Symposium, Tokyo, 4–8 August): 1–10.
Chambers, David Wade & Richard Gillespie (2000) ‘Locality in the History of Science: Colonial Science, Technoscience, and Indigenous Knowledge’, in Roy MacLeod (ed.), Nature and Empire: Science and the Colonial Enterprise (Chicago, IL: University of Chicago Press): 221–40.
Charney, J.G., R. Fjortoft & J. von Neumann (1950) ‘Numerical Integration of the Barotropic Vorticity Equation’, Tellus 2: 237–54.
Cramer, Jacqueline & Rob Hagendijk (1985) ‘Dutch Fresh-Water Ecology: The Links between National and International Scientific Research’, Minerva 23: 485–503.
Crawford, Elisabeth, Terry Shinn & Sverker Sorlin (1993) ‘The Nationalization and Denationalization of the Sciences: An Introductory Essay’, in E. Crawford, T. Shinn & S. Sorlin (eds), Denationalizing Science: The Context of International Scientific Practice (Dordrecht: Kluwer): 1–42.
Cressman, George (1996) ‘The Origins and Rise of Numerical Weather Prediction’, in James Fleming (ed.), Historical Essays on Meteorology, 1919–1995 (Boston, MA: American Meteorological Society): 21–39.
Cueto, Marcos (1994) ‘Laboratory Styles in Argentine Physiology’, Isis 82: 228–46.
Dedijer, Stevan (1963) ‘Underdeveloped Science in Underdeveloped Countries’, Minerva 2: 61–91.
Dedijer, Stevan (1991) ‘The Relationship of Science and Society in FPRY’, Scientia Yugoslavica 16: 129–50.
Dezeljin, Josip (1966) ‘The Status, Organization and Financing of Scientific Research in Yugoslavia’, Encyclopedia Moderna 1: 89–91.
Dezeljin, Josip (1985) ‘Znanost i Tehnologija u Razvitku Socijalistickog Samoupravnog Drustva’ [Science and Technology in the Development of the Socialist Self-Management Society], Ekonomski Pregled 36: 493–513.


Dyson, Freeman J. (2002) ‘In Praise of Amateurs’, The New York Review of Books (5 December): <http://www.nybooks.com/contents/20021205> (accessed June 2003).
Edwards, Paul N. (2000) ‘A Brief History of Atmospheric Circulation Modeling’, in David Randall (ed.), General Circulation Model Development (New York: Academic Press): 67–90.
Edwards, Paul N. (2001) ‘Representing the Global Atmosphere: Computer Models, Data, and Knowledge about Climate’, in Clark A. Miller & Paul N. Edwards (eds), Changing the Atmosphere: Expert Knowledge and Environmental Governance (Cambridge, MA: MIT Press): 31–65.
Fine, Gary Alan (2002) ‘Authors of the Storm: Some Things a Sociologist has Learned about Meteorologists and the Weather’, unpublished paper, Department of Sociology, Northwestern University.
Fuller, Steve (1997) ‘The Secularization of Science and a New Deal for Science Policy’, Futures 29: 483–503.
Goonatilake, Susantha (1984) Aborted Discovery: Science and Creativity in the Third World (London: Zed Books).
Harper, Kristine (2001) ‘The Scandinavian Tag-Team: Providers of Atmospheric Reality to Numerical Weather Prediction Efforts in the United States (1948–1955)’, paper presented at the XXI Congress for the History of Science, Mexico City.
Hoch, Paul & Jennifer Platt (1993) ‘Migration and the Denationalization of Science’, in E. Crawford, T. Shinn & S. Sorlin (eds), Denationalizing Science: The Context of International Scientific Practice (Dordrecht: Kluwer): 133–52.
Hoke, James E., Norman A. Phillips, Geoffrey DiMego, James Tuccillo & Joseph Sela (1989) ‘The Regional Analysis and Forecast Systems of the National Meteorological Center’, Weather and Forecasting 4: 323–34.
Howcroft, J.G. (1971) ‘Local Forecast Model: Present Status and Preliminary Verification’, NMC Office Note 50, January (National Weather Service, NOAA, US Department of Commerce).
Hughes, Patrick (1970) A Century of Weather Service: A History of the Birth and Growth of the NWS (New York: Gordon and Breach).
Jamison, Andrew (1993) ‘National Political Cultures and the Exchange of Knowledge: The Case of Systems Ecology’, in E. Crawford, T. Shinn & S. Sorlin (eds), Denationalizing Science: The Context of International Scientific Practice (Dordrecht: Kluwer): 187–208.
Janjic, Zavisa (1977) ‘Pressure Gradient Force and Advective Scheme Used in Forecasting with Steep and Small Scale Topography’, Contributions to Atmospheric Physics 50: 186–99.
Janjic, Z.I. (1979) ‘Forward-Backward Scheme Modified to Prevent Two-Grid-Interval Noise and its Application in Sigma Coordinate Models’, Contributions to Atmospheric Physics 52: 69–84.
Janjic, Zavisa (1984) ‘Nonlinear Advection Schemes and Energy Cascade on Semi-Staggered Grids’, Monthly Weather Review 112: 1234–45.
Janjic, Z.I. (1990) ‘The Step-Mountain Coordinate: Physical Package’, Monthly Weather Review 118: 1429–43.
Janjic, Z.I. (1994) ‘The Step-Mountain Eta Coordinate Model: Further Developments of the Convection, Viscous Sublayer and Turbulence Closure Schemes’, Monthly Weather Review 122: 927–45.
Janjic, Z.I., T.L. Black, L. Lazic & F. Mesinger (1988) ‘Forecast Sensitivity to the Choice of the Vertical Coordinate’, Annals of Geophysics 29: 147.
Jianjian, Gong, Grace Wahba, Donald R. Johnson & Joseph Tribbia (1998) ‘Adaptive Tuning of Numerical Weather Prediction Models: Simultaneous Estimation of Weighting, Smoothing, and Physical Parameters’, Monthly Weather Review 126: 210–31.
Junker, Norman W., James E. Hoke & Richard H. Grumm (1989) ‘Performance of NMC’s Regional Models’, Weather and Forecasting 4(3): 368–91.
Kalnay, Eugenia (1992) ‘Operational Numerical Weather Prediction at NMC’, Lecture Notes from the 1992 Colloquium on Operational Environmental Prediction (NOAA).


Kalnay, Eugenia (2002) Atmospheric Modelling, Data Assimilation and Predictability (Cambridge: Cambridge University Press).
Kalnay, E., W.E. Baker, M. Kanamitsu & R. Petersen (1992) ‘Overview of the NMC Analysis and Modeling Plans’, Research Highlights of the NMC Development Division: 1989–1991 (NOAA/NWS, US Department of Commerce) (May): 3–11.
Keyser, Daniel & Louis W. Uccellini (1987) ‘Regional Models: Emerging Tools for Synoptic Meteorologists’, Bulletin of the American Meteorological Society 68(4): 306–20.
Latour, Bruno (1990) ‘Drawing Things Together’, in M. Lynch & S. Woolgar (eds), Representation in Scientific Practice (Cambridge, MA: MIT Press): 19–68.
Lewis, John M. (1993) ‘Meteorologists from the University of Tokyo: Their Exodus to the United States Following World War II’, Bulletin of the American Meteorological Society 74: 1351–60.
Lindqvist, Svante (1993) ‘Introductory Essay: Harry Martinson and the Periphery of the Atom’, in S. Lindqvist (ed.), Center on the Periphery: Historical Aspects of 20th-Century Swedish Physics (Canton, MA: Science History Publications): xi–lv.
Livingstone, David N. (1995) ‘The Spaces of Knowledge: Contributions towards a Historical Geography of Science’, Environment and Planning D 13: 5–34.
Maksimovic, D. (1971) ‘Resources of Science and Science Policy’, in V. Trickovic (ed.), Science and Technology in the Economic Development of Yugoslavia (Belgrade: Institute of Economic Science): 84–152.
Mesinger, F. (1973) ‘A Method for Construction of Second-order Accuracy Difference Schemes Permitting no False Two-grid-interval Wave in the Forecast Fields’, Tellus 25: 444–58.
Mesinger, F. (1982) ‘On the Convergence and Error Problems of the Calculation of the Pressure Gradient Force in Sigma Coordinate Models’, Geophysical and Astrophysical Fluid Dynamics 19: 105–17.
Mesinger, F.
(2000) ‘Limited Area Modeling: Beginnings, State of the Art, Outlook’, in 50th Anniversary of NWP: Commemorative Symposium, Book of Lectures (Potsdam: European Meteorological Society): 85–112.
Mesinger, F. & A. Arakawa (1976) Numerical Methods Used in Atmospheric Models, GARP Publication Series 17(1) (Geneva: GARP).
Mesinger, F. & T. Black (1992) ‘On the Impact on Forecast Accuracy of the Step-mountain (Eta) vs. Sigma Coordinate’, Meteorology and Atmospheric Physics 50: 47–60.
Mesinger, F. & Z.I. Janjic (1974) ‘Design and Some Experiments with the Federal Hydrometeorological Institute – University of Belgrade Model’, Manuscript for WMO Scientific Lectures, Commission for Basic Systems, Belgrade, March–April.
Mesinger, F. & Z.I. Janjic (1978) ‘Description of a Limited Area Model with Primitive Equations on a Fine Mesh of Points that is Suitable for Objective Prediction of Quantities within the Planetary Boundary Layer. Experimental Evaluation of the Model. Limited Area Forecasting Model with Primitive Equations’, in Research Project ‘Weather Forecasting in Yugoslavia’, vol. 1 (Belgrade: FHMI).
Mesinger, F. & Z.I. Janjic (1985) ‘Problems and Numerical Methods of the Incorporation of Mountains in Atmospheric Models’, Lectures in Applied Mathematics 22: 81–120.
Mesinger, F., T.L. Black, D.W. Plummer & J.H. Ward (1992) ‘Eta Model Precipitation Forecasts for a Period Including Tropical Storm Alison’, Research Highlights: 87–96.
Mesinger, F., Z.I. Janjic, S. Nickovic, D. Gavrilov & D.G. Deaven (1988) ‘The Step-Mountain Coordinate: Model Description and Performance for Cases of Alpine Lee Cyclogenesis and for a Case of an Appalachian Redevelopment’, Monthly Weather Review 116: 1493–1518.
Miller, Clark A. (2001) ‘Scientific Internationalism in American Foreign Policy: The Case of Meteorology, 1947–1958’, in Clark A. Miller & Paul N. Edwards (eds), Changing the Atmosphere: Expert Knowledge and Environmental Governance (Cambridge, MA: MIT Press): 167–92.
Miller, Clark A.
& Paul N. Edwards (eds) (2001) Changing the Atmosphere: Expert Knowledge and Environmental Governance (Cambridge, MA: MIT Press).


Monmonier, Mark (1999) Air Apparent: How Meteorologists Learned to Map, Predict, and Dramatize Weather (Chicago, IL: University of Chicago Press).
Namias, Jerome (1968) ‘Long Range Weather Forecasting – History, Current Status and Outlook’, Bulletin of the American Meteorological Society 49: 438–70.
National Advisory Committee on Oceans and Atmosphere (1982) The Future of the Nation’s Weather Services: A Special Report to the President and The Congress, July (Washington, DC: NACOA).
Nebeker, Frederik (1995) Calculating the Weather: Meteorology in the 20th Century (San Diego, CA: Academic Press).
Nicholson, Malcolm (1989) ‘National Styles, Divergent Classifications: A Comparative Study from the History of French and American Plant Ecology’, Knowledge and Society: Studies in the Sociology of Science Past and Present 8: 139–86.
NMC Handbook (1990) ‘NMC Handbook, September 30, 1990’ (MS at World Weather Building, Camp Springs, MD).
Norton, Stephen & Frederick Suppe (2001) ‘Why Atmospheric Modeling is Good Science’, in Clark A. Miller & Paul N. Edwards (eds), Changing the Atmosphere: Expert Knowledge and Environmental Governance (Cambridge, MA: MIT Press): 67–106.
Ophir, Adi & Steven Shapin (1991) ‘The Place of Knowledge: A Methodological Survey’, Science in Context 4: 3–21.
Oreskes, Naomi (2000) ‘Why Predict? Historical Perspectives on Prediction in Earth Science’, in Daniel Sarewitz, Roger A. Pielke Jr & Radford Byerly Jr (eds), Prediction: Science, Decision Making and the Future of Nature (Washington, DC: Island Press): 23–40.
Petersen, Arthur (2000) ‘Philosophy of Climate Science’, Bulletin of the American Meteorological Society 81: 265–71.
Petersen, R. (1992) ‘Review of the Changes in the Regional and Mesoscale Forecasting Systems During 1989–1991’, Research Highlights: 13–17.
Petersen, Ralph A. et al. (1991) ‘Changes to NMC’s Regional Analysis and Forecast System’, Weather and Forecasting 6(1): 133–41.
Phillips, Norman A.
(1957) ‘A Coordinate System Having Some Special Advantages for Numerical Forecasting’, Journal of Meteorology 14: 184–85. Pickering, Andrew (1999) ‘The Mangle of Practice: Agency and Emergence in the Sociology of Science’, in Mario Biagioli (ed.), The Science Studies Reader (New York & London: Routledge): 372–93. Pielke, Roger A. (1984) Mesoscale Meteorological Modeling (New York: Academic Press). Pielke, Roger A. (2002) ‘Mesoscale Atmosphere Modeling’, in Encyclopedia of Physical Science and Technology, 3rd edn (New York: Academic Press), vol. 9, 383–89. Randall, David A. & Bruce A. Wielicki (1997) ‘Measurements, Models, and Hypotheses in the Atmospheric Sciences’, Bulletin of American Meteorological Society 78: 399–406. Randerson, Darryl (1976) ‘Overview of Regional-Scale Numerical Models’, Bulletin of the American Meteorological Society 57: 797–804. Rath, Amitav (1990) ‘Science, Technology and Policy in the Periphery: A Perspective from the Centre’, World Development 11: 1429–43. Razem, Dusan (1994) ‘Radiation Processing in the Former Yugoslavia, 1947–1966: From “Big Science” to Nullity’, Minerva 32: 309–326. Reed, Richard J. (1971) ‘The Effects of GARP and other Future Large Programs on Education and Research in the Atmospheric Sciences’, Bulletin of the American Meteorological Society 52: 458–62. Report (1994) Report of the University Corporation for Atmospheric Research (Boulder, CO: UCAR). Review (1976) Review of National Science Policy: Yugoslavia (OECD: Paris). Robinson, G.D. (1967) ‘Some Current Projects for Global Meteorological Observation and Experiment’, Quarterly Journal of the Royal Meteorological Society 93: 409–418. Rubinstein, Alvin Z. (1970) Yugoslavia and the Nonaligned World (Princeton, NJ: Princeton University Press).


Social Studies of Science 34/1

Sagasti, F.R. (1978–79) ‘Toward an Endogenous Scientific and Technological Development for the Third World’, Alternatives 4: 301–16.
Sah, Raaj (1979) ‘Priorities of Developing Countries in Weather and Climate’, World Development 7: 337–47.
Salomon, Jean-Jacques, Francisco R. Sagasti & Celine Sachs-Jeantet (1994) The Uncertain Quest: Science, Technology and Development (Tokyo: United Nations University Press).
Science Policy (1973) Committee for the Co-ordination of Science and Technology, Science Policy in the SFR Yugoslavia (Belgrade).
Shapin, Steven (1998) ‘Placing the View from Nowhere: Historical and Sociological Problems in the Location of Science’, Transactions of the Institute of British Geographers NS 23: 5–12.
Shea, Eileen L. (ed.) (1987) A History of NOAA (manuscript presented to Historical Council, Department of Commerce).
Shinn, Terry, Jack Spaapen & Venni Krishna (1997) ‘Science, Technology, and Society Studies and Development Perspectives in South–North Transactions’, in T. Shinn, J. Spaapen & V. Krishna (eds), Science and Technology in a Developing World (Dordrecht: Kluwer): 1–36.
Shrum, Wesley & Yehouda Shenhav (1995) ‘Science and Technology in Less Developed Countries’, in Sheila Jasanoff, Gerald Markle, James Peterson & Trevor Pinch (eds), Handbook of Science and Technology Studies (Thousand Oaks, CA: SAGE Publications): 627–52.
Shuman, Frederick G. (1977) ‘Plans of the National Meteorological Center for Numerical Weather Prediction’, NMC Office Note 144 (April).
Shuman, Frederick G. (1978) ‘Numerical Weather Prediction’, Bulletin of the American Meteorological Society 59: 5–17.
Shuman, Frederick G. (1989) ‘History of Numerical Weather Prediction at the National Meteorological Center’, Weather and Forecasting 4: 286–96.
Stackpole, John D. (1978) ‘How to Pick a New Forecast Model: The Selection of the 7L PE for NMC Operation’, NMC Office Note 191 (October).
Star, Susan Leigh (1988) ‘The Structure of Ill-Structured Solutions: Boundary Objects and Heterogeneous Distributed Problem Solving’, in M. Huhns & L. Gasser (eds), Distributed Artificial Intelligence (Menlo Park, CA: Morgan Kaufmann): 37–54.
Star, Susan Leigh (1989) Regions of the Mind: Brain Research and the Quest for Scientific Certainty (Stanford, CA: Stanford University Press).
Stokes, Raymond (1997) ‘In Search of the Socialist Artefact: Technology and Ideology in East Germany, 1945–1962’, German History 15: 221–39.
Thompson, Philip (1987) ‘The Maturing of the Science’, Bulletin of the American Meteorological Society 68: 631–37.
Treadon, R.E. (1993) ‘The NMC Eta Model Post Processor: A Documentation’, NMC Office Note 394 (Silver Spring, MD: NOAA/NWS).
Turnbull, David (1993–94) ‘Local Knowledge and Comparative Scientific Traditions’, Knowledge and Policy 6: 29–54.
Turnbull, David (1995) ‘Rendering Turbulence Orderly’, Social Studies of Science 25: 9–33.
Turnbull, David (1997) ‘Reframing Science and Other Local Knowledge Traditions’, Futures 29(6): 551–62.
US Weather Research Program (1994) United States Weather Research Program: Implementation Plan. Congressional (15 January) (Washington, DC: US Government Printing Office).
Watson-Verran, Helen & David Turnbull (1995) ‘Science and Other Indigenous Knowledge Systems’, in Sheila Jasanoff, Gerald Markle, James Peterson & Trevor Pinch (eds), Handbook of Science and Technology Studies (Thousand Oaks, CA: SAGE Publications): 115–39.
Whitnah, Donald R. (1961) A History of the U.S. Weather Bureau (Urbana, IL: University of Illinois Press).



Widmalm, Sven (1993) ‘Big Science in a Small Country: Sweden and CERN II’, in S. Lindqvist (ed.), Center on the Periphery: Historical Aspects of 20th-Century Swedish Physics (Canton, MA: Science History Publications): 107–40.
Willetts, Peter (1978) The Non-Aligned Movement: The Origins of a Third World Alliance (London: Pinter).
Xiao-Rong, Guo & James E. Hoke (1985) ‘The Impact of Sensible and Latent Heating on the Prediction of an Intense Extratropical Cyclone’, NMC Office Note 314 (October).

Vladimir Jankovic is a Wellcome Research Lecturer at the Centre for the History of Science, Technology and Medicine, University of Manchester. He has published Reading the Skies: A Cultural History of English Weather, 1650–1830 (University of Chicago Press, 2000) and more recently ‘The Politics of Sky Battles in Early Hanoverian Britain’ (Journal of British Studies, 2002). His current research is in the history of European environment and health. Address: Centre for the History of Science, Technology and Medicine, University of Manchester, Manchester M13 9PL, UK; fax: +44 161 275 5969; email [email protected]