Operational and RCR HIRLAM at FMI
Markku Kangas and Niko Sokka
Finnish Meteorological Institute
1 Introduction
The main activities at FMI during 2004 have concentrated on maintaining and developing the RCR HIRLAM (Kangas, 2004) as the FMI operational forecast suite. A significant effort has also been devoted to forecast visualisation, by adding new elements and by making the visualisation suite visible to all HIRLAM members through HeXnet. Special emphasis has been laid on forecast product archiving at ECMWF: a continuous record of forecasts, starting from the beginning of the RCR operations at FMI, is now available. A meso-β model, MBE (Järvenoja, 2005), embedded within the RCR area, has also been implemented. It has a horizontal resolution of 9 km and a dynamic time step of 3 minutes. As an additional activity, the HIRLAM group has been involved in the acquisition of a new SGI computer (Altix 350, 16 CPUs, 64 GB memory), intended to replace the CSC IBM as the FMI operational platform. This report describes the operational HIRLAM environment as well as some related activities at FMI. First, the operational system is described from both the meteorological and the technical point of view. Next, the products of the model as well as the archiving are discussed. Finally, operational experiences and future plans are briefly touched upon.
2 Meteorological implementation
Starting from February 2004, the main operational suite at FMI has been RCR, based on HIRLAM reference version 6.3, with the following local implementation differences (as allowed by the RCR agreement):

• surface analysis: inclusion of
  o Baltic SST/ice observations (delivered by the Finnish Institute of Marine Research)
  o climatological lake observations in the Finnish area
• full SMS control

The horizontal resolution of the suite is 0.2° (22 km), with 40 levels in the vertical. The dynamic time step is 6 minutes and the horizontal grid size 438 x 336 points, covering the so-called larger FMI Atlantic area (Fig. 1). The boundaries are obtained from the ECMWF global model with a time resolution of three hours. The analysis is based on 3D-VAR.
Fig. 1: FMI operational HIRLAM integration areas. Shades of grey indicate orography height (metres).
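As a compact recap of the numbers quoted above, the following short Python sketch recomputes the approximate grid spacing and domain extent of the RCR suite. The 111 km-per-degree conversion is a rough spherical-Earth approximation and the variable names are only illustrative:

KM_PER_DEG = 111.0          # rough conversion for the rotated lat-lon grid

resolution_deg = 0.2        # RCR horizontal resolution
nx, ny, nlev = 438, 336, 40 # grid size and number of vertical levels
dt_minutes = 6              # dynamic time step

dx_km = resolution_deg * KM_PER_DEG
print(f"grid spacing : ~{dx_km:.0f} km")                           # ~22 km
print(f"domain extent: ~{nx * dx_km:.0f} km x {ny * dx_km:.0f} km")
print(f"time step    : {dt_minutes} min, levels: {nlev}")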
In the data assimilation, TEMP, PILOT, SYNOP, SHIP, AIREP, and DRIBU observations are used. The observation cut-off time is 2 hours for the main synoptic hours (00, 06, 12, 18 UTC) and 4 hours 20 minutes for the intermediate hours (03, 09, 15, 21 UTC). A digital filter is used for initialization. In parallel with RCR, a meso-β model suite called MBE (Järvenoja, 2005) has been implemented and run operationally since November 2004. It is embedded within the RCR area and obtains its boundaries from the RCR forecasts at 3-hour intervals. Physically it is based on the HIRLAM RCR model, the major differences being the horizontal resolution (0.08°, or 9 km) and the dynamic time step (3 minutes). The integration area is shown in Fig. 1.
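The cut-off rule is simple enough to express directly in code; the sketch below merely restates the rule given above (the function name is hypothetical, not part of the operational scripts):

from datetime import timedelta

def observation_cutoff(analysis_hour: int) -> timedelta:
    """Observation cut-off used in the RCR data assimilation.

    Main synoptic hours (00, 06, 12, 18 UTC): 2 h.
    Intermediate hours (03, 09, 15, 21 UTC): 4 h 20 min.
    """
    if analysis_hour % 6 == 0:
        return timedelta(hours=2)
    return timedelta(hours=4, minutes=20)

print(observation_cutoff(12))   # 2:00:00
print(observation_cutoff(15))   # 4:20:00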
3 Technical and operational implementation

3.1 Data acquisition and operational control
The centre of the data acquisition, as well as of the SMS control, is an SGI computer called Metis, located at FMI. It hosts the SMS program and acts as a transfer point for data to and from the actual forecast task.
[Figure 2 is a data-flow diagram: observations (TEMP, PILOT, SYNOP, SHIP, AIREP, DRIBU) and FIMR Baltic SST/ice data are collected at FMI/Metis (about 90 MB/d); boundaries arrive from ECMWF (1.3 GB/d, 13 Mbit/s); boundaries and observations are sent to CSC/IBM 1600 (1.4 GB/d, 100 Mbit/s); forecast and visualization data return to Metis (13.1 GB/d, 100 Mbit/s); archiving to ECMWF amounts to 14.5 GB/d (13 Mbit/s). The SMS control, HeXnet and the local archive are also shown as connected to Metis.]
Fig. 2: RCR suite data flows (d = day, s = second).
Fig. 2 shows the principal data flows in the system. The various observations (SYNOP, TEMP, etc.) as well as the Baltic ice data from the Finnish Institute of Marine Research are first collected through various channels to Metis, manipulated, and then transferred to CSC (the Finnish IT centre for science) for the actual calculations. The same applies to the boundary data obtained from ECMWF. After the calculations on the CSC supercomputer, the numerical results as well as some graphical products are transferred back to Metis for miscellaneous uses by duty forecasters, researchers, and automated forecast products. Finally, the data output as well as selected input is archived at ECMWF using the ecaccess gateway. A graphical interface to the forecast products for monitoring and data visualization is provided, and is available also to the HIRLAM community through HeXnet (cf. Fig. 2).
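To give an idea of the load these flows put on the links, the following back-of-the-envelope sketch estimates how long each daily volume of Fig. 2 keeps its link busy. All figures are approximate and the helper name is only illustrative:

# Rough estimate of daily transfer durations for the RCR data flows
# (volumes and link speeds as quoted in Fig. 2; purely illustrative).

FLOWS = [
    # (description,                        GB per day, link speed in Mbit/s)
    ("boundaries ECMWF -> Metis",                1.3,  13),
    ("input Metis -> CSC/IBM",                   1.4, 100),
    ("forecast/visualization CSC -> Metis",     13.1, 100),
    ("archiving Metis -> ECMWF",                14.5,  13),
]

def hours_per_day(volume_gb: float, speed_mbit_s: float) -> float:
    """Time needed to move volume_gb over a link of speed_mbit_s, in hours."""
    bits = volume_gb * 8 * 1024 ** 3            # GB -> bits
    return bits / (speed_mbit_s * 1e6) / 3600   # seconds -> hours

for name, gb, mbit in FLOWS:
    print(f"{name:40s} {hours_per_day(gb, mbit):4.1f} h/day")

At 13 Mbit/s, the daily archiving volume alone corresponds to roughly 2.5 to 3 hours of transfer per day, which is of the same order as the per-forecast transfer time quoted in Section 4.1.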
3.2 Forecast and data assimilation
The model and the data assimilation are run at CSC, the Finnish IT centre for science. The computer is an IBM eServer Cluster 1600 supercomputer, which consists of 16 IBM p690 nodes, each with 32 1.1 GHz Power4 processors (i.e., a total of 16 x 32 = 512 processors). Its theoretical peak performance is 2.2 TFlop/s, the total memory is 512 GB, and the size of the disk system is 0.5 TB. One node with 32 processors and 32 GB of shared memory is entirely dedicated to FMI. Of these processors, the HIRLAM suite uses 28 for parallel processing, the rest being reserved for other FMI uses. One of the remaining nodes is reserved as a backup, to be used for HIRLAM if the FMI node is down.
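The quoted peak performance can be verified with a line of arithmetic, assuming the usual figure of four floating-point operations per clock cycle for a Power4 processor (two fused multiply-add units). This is only a sanity check, not part of the operational suite:

# Back-of-the-envelope check of the IBM Cluster 1600 peak performance.
nodes = 16
cpus_per_node = 32
clock_hz = 1.1e9          # 1.1 GHz Power4
flops_per_cycle = 4       # two FMA units per processor (assumed)

peak = nodes * cpus_per_node * clock_hz * flops_per_cycle
print(f"theoretical peak: {peak / 1e12:.2f} TFlop/s")   # ~2.25, i.e. the ~2.2 TFlop/s quoted above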
[Figure 3 is a timeline of the daily RCR and MBE runs, showing the start and end times (UTC) of each cycle on the IBM.]
Fig. 3: Operational HIRLAM scheduling.
As to RCR, the main forecast runs of 54 hours take place at 00, 06, 12 and 18 UTC. The total wall-clock time of the suite is 1 h 45 min, of which the forecast takes 1 h 10 min. The intermediate cycles, with 6-hour forecasts, at 03, 09, 15 and 21 UTC take 30 minutes (forecast 10 minutes). With MBE there are no intermediate cycles. The main runs (54 hours) take 3.5 hours in total, the forecast itself taking 1 hour 15 minutes, i.e. about the same as in RCR. The longer total wall-clock time is mainly due to a restriction on the MBE runs, which prevents them from entering the actual (parallelized) forecast phase before RCR has finished its own forecast and released the processors. Fig. 3 shows the approximate timing of the forecast routines on the IBM.
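Under the simplifying assumption that each run starts right at its observation cut-off (in reality the trigger times in Fig. 3 differ somewhat, and MBE also waits for the RCR processors), the product availability implied by the numbers above can be tabulated as follows:

from datetime import timedelta

# Approximate availability of products relative to the nominal analysis time,
# assuming each run starts at its observation cut-off (illustrative only).
CYCLES = {
    "RCR main (00/06/12/18 UTC)":         (timedelta(hours=2),             timedelta(hours=1, minutes=45)),
    "RCR intermediate (03/09/15/21 UTC)": (timedelta(hours=4, minutes=20), timedelta(minutes=30)),
    "MBE main (00/06/12/18 UTC)":         (timedelta(hours=2),             timedelta(hours=3, minutes=30)),
}

for name, (cutoff, duration) in CYCLES.items():
    print(f"{name:36s} products ~{cutoff + duration} after the analysis time")

The resulting three to five and a half hours after the nominal analysis time are consistent with the files appearing in the ECMWF archive about six hours after the analysis time (Section 4.1), once the transfer itself is added.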
3.3 Parallel run suite for research and model development
In an effort to ease the task of running tests and experiments, an "RCR_parallel" option has been developed and built into the reference HIRLAM (version 6.3.7), to be used when running experiments at ECMWF. Enabling this option makes the HIRLAM run employ the same observations, boundaries and boundary strategy (archived at ECMWF) as the original RCR runs (Eerola, 2005).
3.4 New computing capacity at FMI
In January 2005, FMI purchased new computing capacity in the form of an SGI Altix 350 series computer. It has 16 processors and 64 GB of memory, but can easily be enlarged to a 64-processor system and even further. Preliminary tests have shown a significant increase in computing power compared to the 32-processor IBM node now used by FMI. At present, the SGI system runs a 54-hour HIRLAM forecast in about 50 minutes, compared to 70 minutes on the IBM. The analysis part of the suite has, however, not yet been parallelized on the Altix. Once that is done, a roughly twofold speed-up compared to the present IBM system is expected.
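For reference, the measured forecast-only gain can be put in numbers (a trivial check; the expected twofold overall gain refers to the situation after the analysis has also been parallelized):

ibm_min, sgi_min = 70, 50   # 54 h forecast wall-clock times quoted above
print(f"current forecast-only speed-up: {ibm_min / sgi_min:.2f}x")   # 1.40x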
4 Data archiving

4.1 ECMWF archive
As defined in the RCR agreement, a specific set of model output and input is archived at ECMWF for the use of the HIRLAM project. Instead of large tar files, the data are archived into a descriptive directory structure. The naming of the directories is based on the corresponding analysis time, being of the form

ec:/rui/hirlam/RCRa/yyyy/mm/dd/hh

where yyyy is the analysis year, mm the calendar month, dd the calendar day, and hh the daily hour. The archived data include:

• contents of the earlier AR*.tar files
• contents of the earlier VE*.tar files
• boundary files (all used boundaries and the boundary strategy file)
• observations
• FIMR Baltic ice data
• log files
The data are transferred first from CSC to FMI and then on to ECMWF using ecaccess over the Internet. The transfer is routed via FMI because there is no ecaccess gateway on the IBM at CSC. Typically, the files are in the archive about 6 hours after the nominal analysis time (i.e. the 00 UTC data at about 06 UTC). The transfer time for a 54-hour forecast run is about 30 minutes. Operationally, about 12 cycles per month fail to archive automatically; these failures are corrected manually during the following month. As a result, there are no gaps in the archived data during the FMI RCR operations.
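The directory name can be generated directly from the analysis time; the following minimal sketch assumes a simple helper of our own (not an existing ecaccess tool):

from datetime import datetime

ARCHIVE_ROOT = "ec:/rui/hirlam/RCRa"

def rcr_archive_dir(analysis_time: datetime) -> str:
    """Build the ECMWF archive directory for a given RCR analysis time."""
    return analysis_time.strftime(f"{ARCHIVE_ROOT}/%Y/%m/%d/%H")

# Example: the 06 UTC analysis on 1 February 2005
print(rcr_archive_dir(datetime(2005, 2, 1, 6)))
# -> ec:/rui/hirlam/RCRa/2005/02/01/06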
4.2 Local archives
With the improved resolution, the amount of data to be archived has increased greatly. Because of this, a new local data storage policy has been adopted, leading to a significant decrease in the amount of data stored locally. With the main archive repository now being at ECMWF, the local archive at CSC consists mainly of the data needed for forecast verification and other immediate uses. For the MBE suite, a full archive has so far been maintained locally at CSC.
5 Graphical products and monitoring
The graphical interface to the forecast products, for operational monitoring and easier utilization, has been developed further by making it available to all HIRLAM members through HeXnet and by adding new features. The items now included in the monitoring pages are:

• Weather maps
  o animated as well as static maps of temperatures, pressures, wind speeds, pseudo satellite pictures etc., for various forecasts and for the smaller and larger areas
• Baltic Sea wind maps
• Two types of meteograms for various locations
  o variables such as liquid cloud water, turbulent kinetic energy, boundary layer height, wind speed and direction, as well as various two-metre temperatures, as a function of height and time
• Mast verification plots
  o Sodankylä and Cabauw mast measurements vs. HIRLAM and ARPEGE forecasts
  o HIRLAM forecasts compared against measurements from three television link masts of the Finnish Broadcasting Company (YLE)
• Statistical verification
  o observation verification and log file statistics plots for various areas
• Monthly report: monthly model statistics in graphical form
  o observation statistics
  o observation verification
  o field verification
  o observation coverage
  o 2 m temperature bias verification
• Monitor window
  o observation coverage maps
  o analysis increments
  o observation/first-guess/analysis statistics
  o surface analysis observations and fields
  o environmental maps (e.g. precipitation, snow coverage)
• Information
  o archiving
  o run times
  o additional model documentation (e.g. contents of output files)
6 Operational experiences and problems
The RCR operations at FMI have in general proceeded without major malfunctions or technical problems. The data archiving, based on a tape robot system at CSC, has however caused repeated problems, usually in the form of a loss of the archiving functions, but occasionally also leading to the failure and shutdown of the FMI node at CSC. To correct the situation, operator intervention and manual corrections have been needed. This situation has now persisted for over a year, but no satisfactory solution has been found. In addition, a model-related problem was caused by numerical instabilities, encountered mainly in MBE but also in the RCR model. These are probably due to very high wind speeds at the upper levels. The only solution found was to reduce the dynamic time step, from 6 to 4 minutes in RCR and from 3 to 2 minutes in MBE.
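The benefit of a shorter time step can be illustrated with a simple one-dimensional advective CFL estimate, u_max ≈ Δx/Δt. This is only a rough guide, since the actual stability limits of the model dynamics depend on the numerical scheme used, but it shows how the reduced time steps raise the wind speeds the models can tolerate:

# Rough 1-D advective CFL estimate: the wind speed at which u*dt/dx = 1.
def cfl_wind_limit(dx_km: float, dt_min: float) -> float:
    """Wind speed (m/s) for which the 1-D Courant number reaches 1."""
    return dx_km * 1000.0 / (dt_min * 60.0)

for name, dx, dt_old, dt_new in [("RCR", 22.0, 6.0, 4.0), ("MBE", 9.0, 3.0, 2.0)]:
    print(f"{name}: {cfl_wind_limit(dx, dt_old):.0f} m/s with dt = {dt_old:g} min, "
          f"{cfl_wind_limit(dx, dt_new):.0f} m/s with dt = {dt_new:g} min")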
7 Conclusions
The operational system for the HIRLAM RCR reference forecast model and other operational activities at the Finnish Meteorological Institute have been described. HIRLAM reference version 6.3 has been in operational use since February 2004. The activities at FMI also include the implementation and operational use of the meso-β model MBE with a 9 km horizontal resolution (semi-operational from May to October 2004, operational since November 2004). The monitoring interface has been developed further and made available on-line to all HIRLAM participants.
In the future, the FMI activities include a possible transfer of the operational activities from the CSC IBM supercomputer to a local SGI-based system, which has proved to be twice as fast as the IBM and economically very efficient. A further activity will be the move of FMI to new premises in September 2005.
References

Eerola, K., 2005: Verification of HIRLAM version 6.3.5 against RCR in autumn conditions. HIRLAM Newsletter, 47, 72-90.
Järvenoja, S., 2005: A new meso-β-scale HIRLAM system at FMI. HIRLAM Newsletter, 47, 91-106.
Kangas, M., 2004: The operational HIRLAM at FMI. HIRLAM Newsletter, 45, 15-22.