Validation Database for Evaluating Vapor Dispersion Models for Safety Analysis of LNG Facilities

Guide to the LNG Model Validation Database, Version 12

FINAL REPORT

BY: J.R. Stewart, S. Coldrick, C.J. Lea, S.E. Gant, M.J. Ivings
Health & Safety Laboratory, Buxton, Derbyshire, UK

September 2016

© 2016 Fire Protection Research Foundation
1 Batterymarch Park, Quincy, MA 02169-7417, USA
Email: [email protected] | Web: nfpa.org/foundation
FOREWORD

This report provides a description of the newly revised NFPA LNG Model Validation Database, Version 12. The Database is contained in a single Microsoft Excel 2010 spreadsheet and is also available in plain ASCII text.

The Fire Protection Research Foundation expresses gratitude to the report authors: James Stewart, Simon Coldrick, Chris Lea, Simon Gant, and Matthew Ivings, who are with the Health & Safety Laboratory, located in Buxton, Derbyshire, UK. The Research Foundation appreciates the guidance provided by the Project Technical Panelists, the funding provided by the project sponsors, and all others that contributed to this research effort.

The content, opinions and conclusions contained in this report are solely those of the authors and do not necessarily represent the views of the Fire Protection Research Foundation, NFPA, Technical Panel or Sponsors. The Foundation makes no guaranty or warranty as to the accuracy or completeness of any information published herein.
About the Fire Protection Research Foundation The Fire Protection Research Foundation plans, manages, and communicates research on a broad range of fire safety issues in collaboration with scientists and laboratories around the world. The Foundation is an affiliate of NFPA.
About the National Fire Protection Association (NFPA) Founded in 1896, NFPA is a global, nonprofit organization devoted to eliminating death, injury, property and economic loss due to fire, electrical and related hazards. The association delivers information and knowledge through more than 300 consensus codes and standards, research, training, education, outreach and advocacy; and by partnering with others who share an interest in furthering the NFPA mission. All NFPA codes and standards can be viewed online for free. NFPA's membership totals more than 65,000 individuals around the world.
Keywords: liquefied natural gas, LNG, dispersion, database Report number: FPRF-2016-26
PROJECT TECHNICAL PANEL
Jay Jablonski, HSB PLC
Richard Hoffmann, Hoffmann & Feige
Leon Bowdin, Hess LNG LLC
Francis Katulak, Distrigas of Massachusetts, LLC
Kevin Ritz, Baltimore Gas & Electric Company
Jeffery Beale, CH-IV International
David Butler, City of Everett Fire Department
Andrew Kohout, FERC
Janna Shapiro, NFPA 59A Staff Liaison
Phani Raj, Federal Railroad Administration
Anay Luketa-Hanlin, Sandia Labs
Filippo Gavelli, GexCon US
PROJECT SPONSORS
US Department of Transportation (DOT) Pipeline and Hazardous Materials Safety Administration (PHMSA)
Validation Database for Evaluating Vapor Dispersion Models for Safety Analysis of LNG Facilities Guide to the LNG Model Validation Database Version 12
Author(s): J.R. Stewart, S. Coldrick, C.J. Lea, S.E. Gant, M.J. Ivings
Report Number: MSU/2016/12
HSL: HSE’s Health and Safety Laboratory
Report approved for issue by: Dr Bronwen Ley
Date of Issue: 16th September 2016
Lead Author: J.R. Stewart BSc, MSc, Fluid Dynamics Team
Contributing Author(s): Dr S. Coldrick CEng MIMechE, Fluid Dynamics Team; Dr C.J. Lea CEng FIMechE, LeaCFD consultants; Dr S.E. Gant CEng FIMechE, Fluid Dynamics Team; Dr M.J. Ivings CPhys MInstP, Fluid Dynamics Team
Customer: Oak Ridge National Laboratory (ORNL)
Technical Reviewer(s): Dr C.J. Lea CEng FIMechE
Editorial Reviewer: Charles Oakley CEng FIMechE
Project number: PE06387
Disclaimer: This report and the work it describes were undertaken by the Health and Safety Laboratory under contract to Oak Ridge National Laboratory. Its contents, including any opinions and/or conclusion expressed or recommendations made, do not necessarily reflect policy or views of the Health and Safety Executive.
ACKNOWLEDGEMENTS The authors would like to express their sincere thanks to the following people for their help in producing this report:
Julie Halliday (Pipelines and Hazardous Materials Safety Administration)
Simon Rose (Oak Ridge National Laboratory)
Andrew Kohout and David Rosenberg (Federal Energy Regulatory Commission)
Michael Schatzmann (Hamburg University)
Lorenzo Mauri (GexCon)
Steven Hanna (Hanna Consultants)
Joseph Chang (US Department of Homeland Security)
Henk Witlox (DNVGL)
Morten Nielsen and Søren Ott (Technical University of Denmark)
Gert König-Langlo (Alfred Wegener Institute for Polar and Marine Research)
FOREWORD

1. This report provides a description of the newly revised NFPA LNG Model Validation Database, Version 12. The Database is contained in a single Microsoft Excel 2010 spreadsheet and is also available in plain ASCII text.

2. The main changes from Version 11 to Version 12 of the Database are:

   a) Formulae are now embedded within the Excel spreadsheet to calculate automatically the Concentration Safety Factor and Distance Safety Factor parameters.
   b) Point-wise concentration data have been added for the Maplin Sands and Thorney Island experiments.
   c) Point-wise concentrations of 0.1% v/v or less have been included for all of the experiments (previously, these values were omitted).
   d) Errors in the experimental data have been corrected, notably for the BA-Hamburg wind-tunnel experiments.

3. This report, which accompanies the Database, has also been updated to:

   a) Clarify the method used to calculate maximum arc-wise concentrations
   b) Provide a summary of uncertainties in each of the experiments
   c) Provide an audit trail of the modifications to the data in changing from Version 11 to Version 12

4. There are no new experiments included in Version 12 of the Database, although there is considerably more data provided on the existing experiments.

5. Many of these new features have been added to the Database to help users meet the requirements of the Pipelines and Hazardous Materials Safety Administration Advisory Bulletin PHMSA-2010-0226 on "Liquefied Natural Gas Facilities: Obtaining Approval of Alternative Vapor-Gas Dispersion Models" (PHMSA, 2010).

6. The Database, and the manner in which models are to be applied to predict the trials contained in the Database, may be further refined in light of future developments and experience.
CONTENTS

1 INTRODUCTION ..... 1
  1.1 Background ..... 1
  1.2 Content of the report ..... 2
2 SELECTION OF DATASETS ..... 3
3 CONTENT OF THE DATABASE ..... 5
  3.1 Format ..... 5
  3.2 Test Cases ..... 5
  3.3 Test Conditions ..... 7
  3.4 Physical Comparison Parameters ..... 7
  3.5 Wind-Tunnel Data and Scaling ..... 12
  3.6 Statistical Performance Measures (SPMs) ..... 13
4 USE OF THE DATABASE ..... 17
  4.1 Model Configuration ..... 17
  4.2 Wind Speed ..... 17
  4.3 Wind Direction ..... 17
  4.4 Source Term ..... 18
  4.5 Point-wise Concentrations ..... 19
  4.6 Temperatures ..... 20
  4.7 Transformed Sensor Locations ..... 20
  4.8 Averaging Time ..... 20
  4.9 Bifurcated Clouds ..... 21
  4.10 SPM Calculations ..... 21
  4.11 Model Evaluation ..... 21
5 MAPLIN SANDS ..... 22
  5.1 Summary ..... 22
  5.2 Site Description ..... 22
  5.3 Instrumentation ..... 23
  5.4 Data Sources ..... 23
  5.5 Upwind Velocity and Temperature Profiles ..... 23
  5.6 Changes to the Experimental Data in Version 12 ..... 23
  5.7 Experimental and Modeling Uncertainties ..... 24
6 BURRO AND COYOTE ..... 32
  6.1 Summary ..... 32
  6.2 Site Description ..... 32
  6.3 Instrumentation ..... 32
  6.4 Data Sources ..... 33
  6.5 Upwind Velocity and Temperature Profiles ..... 34
  6.6 Trial Specific Notes ..... 34
  6.7 Changes to the Experimental Data in Version 12 ..... 36
  6.8 Experimental and Modeling Uncertainties ..... 39
7 FALCON ..... 44
  7.1 Summary ..... 44
  7.2 Site Description ..... 44
  7.3 Instrumentation ..... 45
  7.4 Data Sources ..... 45
  7.5 Data Extraction ..... 45
  7.6 Upwind Velocity and Temperature Profiles ..... 52
  7.7 Changes to the Experimental Data in Version 12 ..... 52
  7.8 Experimental Uncertainties ..... 53
8 THORNEY ISLAND ..... 59
  8.1 Summary ..... 59
  8.2 Site Description ..... 59
  8.3 Instrumentation ..... 59
  8.4 Data Sources ..... 60
  8.5 Changes to the Experimental Data in Version 12 ..... 60
  8.6 Experimental and Modeling Uncertainties ..... 60
9 CHRC WIND TUNNEL TRIALS ..... 64
  9.1 Summary ..... 64
  9.2 Experiment Description ..... 64
  9.3 Data Source ..... 66
  9.4 Scaling the Wind Tunnel Data ..... 66
  9.5 Changes to the Experimental Data in Version 12 ..... 66
  9.6 Experimental and Modeling Uncertainties ..... 67
10 BA-HAMBURG AND BA-TNO TRIALS ..... 71
  10.1 Summary ..... 71
  10.2 BA-Hamburg Test Description ..... 71
  10.3 BA-TNO Test Description ..... 72
  10.4 Data Source ..... 74
  10.5 Scaling Relations ..... 74
  10.6 Changes to Experimental Data in Version 12 ..... 75
  10.7 Experimental and Modeling Uncertainties ..... 89
11 DATA VERIFICATION ..... 93
  11.1 Checks on the Experimental Data ..... 93
  11.2 Checks on the Formulae in the Excel Spreadsheet ..... 93
  11.3 Other Tests ..... 94
12 CLOSING REMARKS ..... 95
13 APPENDIX A ..... 96
14 APPENDIX B ..... 103
15 REFERENCES ..... 106
1 INTRODUCTION

1.1 BACKGROUND

7. In 2006, the Fire Protection Research Foundation (FPRF) undertook a research project for the National Fire Protection Association (NFPA) Liquefied Natural Gas (LNG) Technical Committee to develop tools for evaluating LNG dispersion models. The work was carried out by the Health and Safety Laboratory (HSL), a directorate of the UK Health and Safety Executive (HSE). HSL developed the LNG Model Evaluation Protocol (MEP), which contained a structure for complete evaluation of LNG dispersion models (Ivings et al., 2007). A partial evaluation of some common dispersion models was also carried out.
8. One of the key elements of the LNG MEP is validation. Ivings et al. (2007) defined the validation requirements for application of the LNG MEP, which required the construction of a Model Validation Database. The purpose of the Database was to present experimental data and model input conditions for a range of dispersion experiments relevant to LNG dispersion. To evaluate a model using the LNG MEP, users were required to simulate each of the experiments and compare model predictions to data presented in the Database using Statistical Performance Measures (SPMs).

9. Subsequently, in 2008, HSL was contracted by FPRF to construct the LNG Model Validation Database. The first version was produced in 2008 and it was later revised in 2009 and 2010 to correct various errors in the data. The Database consisted of a single Microsoft Excel 2010 spreadsheet, which was also available in plain ASCII text. Up until 2016, the working version of the Database was Version 11 (Coldrick et al., 2010).

10. Shortly after Version 11 was published, the Pipelines and Hazardous Materials Safety Administration (PHMSA) issued an Advisory Bulletin that clarified various issues relating to how the Database should be used in practice (PHMSA, 2010). This included:
   a) A description of how the maximum arc-wise concentrations should be calculated (this had not been defined by Coldrick et al., 2010).
   b) The requirement to calculate three new SPMs: the Concentration Safety Factor (CSF), Concentration Safety Factor to the Lower Flammability Limit (CSFLFL) and Distance Safety Factor to the LFL (DSFLFL) – see Section 3 for details.
   c) The requirement for users of the Model Validation Database to consider uncertainties in both the experimental data and their model, and for them to present the results from sensitivity tests to address these uncertainties.

11. The new Version 12 Model Validation Database (which is described in this report) incorporates a substantial number of features to help users of the Database address the requirements of the PHMSA Advisory Bulletin. These include:

   a) Formulae embedded within the Excel spreadsheet version of the Database that automatically calculate the new SPMs (CSF, CSFLFL and DSFLFL)
   b) Point-wise concentration data for the Maplin Sands and Thorney Island experiments (in the latter case, from data newly retrieved from the HSE archives)
   c) Point-wise concentration data for measured concentrations below 0.1% v/v (these were omitted from previous versions of the Database)¹
   d) A short description of uncertainties for each of the experiments
12. The principal reason for including the new point-wise data (Items b and c in the above list) was to provide the users of the Database with the locations of the sensors used in the experiments, which they need in order to calculate the maximum arc-wise concentrations.

13. In addition to these substantial changes to the Database, further errors in the experimental data are corrected in Version 12. For the BA-Hamburg experiments, this has involved discussions with the experts in charge of the original experiments in the 1990s and detailed examination of the original data reports.

14. In addition to revising the Database, changes have been made to the original MEP report, which has been reissued (Ivings et al., 2016). The MEP report provides useful background to the Database and it should be read in conjunction with the present report.
1.2 CONTENT OF THE REPORT

15. This report provides all of the information needed to carry out the validation of models against the LNG Model Validation Database Version 12. Further details, background information and the justification for the approach outlined are provided in the MEP report, Ivings et al. (2016).

16. The next section defines the criteria that were originally used to select the experiments in the Database by Ivings et al. (2007). Section 3 then provides a description of the common elements of the experiments presented in the Database, such as the calculation of the SPMs. Practical guidance on the use of the Database is then given in Section 4. Following this, Sections 5 to 10 present details of each of the experiments contained in the Database. This includes new material on experimental uncertainties. The final section describes the verification process that was used to check the data in the Version 12 Database.

17. In this new version of the Database, some of the experimental data in the field-scale tests and the BA-Hamburg wind tunnel tests have changed. To provide an audit trail of these modifications, a dedicated sub-section on this topic is included in the description of each of the relevant experiments.

¹ However, concentrations equal to or less than 0.1% v/v are not used in the SPM calculations.
2 SELECTION OF DATASETS

18. The datasets presented in the Database were carefully scrutinized and selected by Ivings et al. (2007) to meet the following requirements:

   a) The quality of the data must be fit for purpose, i.e. model evaluation
   b) The data must encompass the main physical processes involved in the dispersion of LNG
   c) The test conditions must be known: i.e. the source configuration, release rate, atmospheric conditions, surface roughness, etc.
   d) The time-averaging applied to the data must be specified. Since the focus of the LNG MEP is on flammable hazards, the data should ideally be available for short time-averages (e.g. of the order of one second)
   e) Scaling effects are important in wind-tunnel tests and scale factors should be within acceptable ranges (Meroney & Neff, 1982)
   f) The data must be freely available and presented in suitable formats.
19. Datasets which met the above requirements were drawn from both field-scale and wind-tunnel experiments.

20. The field-scale experiments selected by Ivings et al. (2007) primarily involved releases of LNG (Maplin Sands, Burro, Coyote and Falcon), supplemented by high quality data from the Thorney Island releases of Freon gas (McQuaid, 1987). These Thorney Island experiments included releases in stable atmospheric conditions, whereas the LNG field-scale experiments were largely restricted to neutral or unstable conditions. All of these field-scale experiments were performed over unobstructed terrain, with the exception of the Falcon trials, in which a large fence surrounded the LNG source.

21. To ensure that the Database is appropriate for the evaluation of LNG dispersion models that account for the effect of obstructions, data was also included from wind-tunnel experiments of dense-gas releases in the presence of obstacles. This wind-tunnel data included experiments undertaken at the Chemical Hazards Research Center at the University of Arkansas (Havens & Spicer, 2005, 2006; Havens et al., 2007), as well as two test series carried out by European Commission-funded projects, referred to as BA-Hamburg and BA-TNO (Nielsen & Ott, 1996; Schatzmann et al., 1991).

22. From the five field-scale experiments (Maplin Sands, Burro, Coyote, Falcon and Thorney Island) and three sets of wind-tunnel experiments (CHRC, BA-Hamburg and BA-TNO), a total of 33 test configurations were selected for the Database by Ivings et al. (2007). Table 1 provides an overview of the test configurations.
Table 1 Summary of the trials and test configurations included in the Database

Maplin Sands, 1980 (Field)
  Trials: 27 – dispersion over sea; 34 – dispersion over sea; 35 – dispersion over sea
  Data source: MDA (Hanna et al., 1993a). Also data reports (Colenbrander et al., 1984a, b, c) and Ermak et al. (1988).

Burro, 1980 (Field)
  Trials: 3, 7, 8, 9
  Data source: REDIPHEM (Nielsen & Ott, 1996). Also MDA, Burro data report (Koopman et al., 1982a, b) and Ermak et al. (1988).

Coyote, 1981 (Field)
  Trials: 3, 5, 6
  Data source: REDIPHEM. Also MDA, Coyote data report (Goldwire et al., 1983b), Morgan et al. (1984) and Ermak et al. (1988).

Falcon, 1987 (Field)
  Trials: 1, 3, 4
  Data source: Data report (Brown et al., 1990).

Thorney Island, 1982-4 (Field)
  Trials: 45 – continuous release; 47 – continuous release
  Data source: MDA. Also Ermak et al. (1988) and HSE data archive.

CHRC, 2006 (Wind tunnel)
  Trials: A – without obstacles; B – with storage tank & dike; C – with dike
  Data source: Havens & Spicer (2005, 2006), Havens et al. (2007) & CHRC – University of Arkansas.

BA-Hamburg (Wind tunnel)
  Trials: Unobstructed DA0120/DAT223; Upwind fence 039051/039072; Downwind fence DA0501/DA0532; Circular fence 039094/...095/...097; Slope DAT647/...631/...632/...637
  Data source: REDIPHEM. Also see Schatzmann et al. (1991), Nielsen & Ott (1996) and Marotzke (1993).

BA-TNO (Wind tunnel)
  Trials: TUV01 – unobstructed; TUV02 – downwind fence; FLS – 3-D mapping
  Data source: REDIPHEM. Also see Nielsen & Ott (1996).
3 CONTENT OF THE DATABASE

3.1 FORMAT

23. The Version 12 Database is contained in a single Microsoft Excel 2010 spreadsheet, which comprises 37 worksheets. The first worksheet is a Key which summarizes the test cases in the Database.

24. Each test case occupies its own worksheet. These worksheets are of broadly similar construction, but the exact format depends on the nature of the experiment and the test data. Details of the test conditions are provided in each worksheet, together with concentration measurements and, if applicable, temperature measurements. For the wind-tunnel tests, measurements are provided at both wind-tunnel scale and equivalent field scale (obtained from the scaling rules described in Sections 9.4 and 10.5).

25. Each worksheet contains an area for model output data to be added. To avoid measurement data or other information being accidentally overwritten, data can only be written into certain cells and the rest of the spreadsheet is protected and can only be edited once a password has been entered.

26. The model output data is used to determine SPMs that are calculated automatically using formulae embedded within the Excel spreadsheet. SPM values are tabulated in two worksheets near the end of the spreadsheet labelled "35# SPM – Individual" and "36# SPM – Grouped Data". The first of these contains SPMs for each test in each experimental series (e.g. Burro 3, Burro 7, and Burro 8) and the second contains SPMs grouped in terms of similar physics (e.g. obstructed and unobstructed field-scale tests). To calculate these SPMs, another worksheet labelled "34# – Results" compiles together all of the calculations and interim results.
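For users who prefer to post-process results outside Excel, the following is a minimal, hedged sketch (not part of the Database itself) of how these two SPM summary worksheets could be loaded with Python and pandas. The worksheet names are those quoted above; the spreadsheet file name is a placeholder and will depend on the user's local copy.

    # Minimal sketch: load the SPM summary worksheets for post-processing.
    # The file name below is a placeholder; the sheet names are quoted in the text above.
    import pandas as pd

    workbook = "LNG_Model_Validation_Database_v12.xlsx"  # assumed local file name

    spm_individual = pd.read_excel(workbook, sheet_name="35# SPM – Individual")
    spm_grouped = pd.read_excel(workbook, sheet_name="36# SPM – Grouped Data")

    print(spm_individual.head())
    print(spm_grouped.head())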
27. The final worksheet of the spreadsheet, labelled "37# Scatter Plots", contains a collection of scatter plots showing measured against predicted maximum arc-wise gas concentration. A total of five scatter plots are generated in the Database v12 showing both short and long time-averaged results (where appropriate) for the following groups of trials: all trials, wind-tunnel trials, field trials, unobstructed trials and obstructed trials.

28. In addition to the Excel spreadsheet, the Database is available in plain ASCII text files that replicate the contents of all the worksheets. The purpose of making the data available in this alternative format is to ensure that the Database is accessible across a range of hardware and software, both now and in the future.

3.2 TEST CASES

29. The test cases which comprise the Database are shown in Table 2 in the order in which they appear as worksheets in the Excel spreadsheet.

30. The data is classified according to whether the test cases consist of releases which are unobstructed (Group 1) or obstructed by the presence of obstacles (Group 2). In Table 2, unobstructed cases are set against a blue colored background.

31. Table 2 shows that for the BA-Hamburg wind tunnel data there are multiple test cases listed for what appears to be the same configuration. In most of these cases, the test is identical except for a shift in the location of concentration sensors.
Table 2 Experimental series and test cases included in the Database

Sheet | Trial name | Test number/description | Field (F) or Wind Tunnel (WT) | Obstructed (O) or Unobstructed (U) | Atmospheric stability | Substance released | Dispersion over water (W) or land (L)
1  | Maplin Sands, 1980 | 27 | F | U | C-D | LNG | W
2  | Maplin Sands, 1980 | 34 | F | U | D | LNG | W
3  | Maplin Sands, 1980 | 35 | F | U | D | LNG | W
4  | Burro, 1980 | 3 | F | U | B | LNG | L
5  | Burro, 1980 | 7 | F | U | D | LNG | L
6  | Burro, 1980 | 8 | F | U | E | LNG | L
7  | Burro, 1980 | 9 | F | U | D | LNG | L
8  | Coyote, 1981 | 3 | F | U | B-C | LNG | L
9  | Coyote, 1981 | 5 | F | U | C-D | LNG | L
10 | Coyote, 1981 | 6 | F | U | D | LNG | L
11 | Falcon, 1987 | 1 | F | O | G | LNG | L
12 | Falcon, 1987 | 3 | F | O | D | LNG | L
13 | Falcon, 1987 | 4 | F | O | D-E | LNG | L
14 | Thorney Island 1982-4 | 45 | F | U | E-F | Freon 12 & Nitrogen | L
15 | Thorney Island 1982-4 | 47 | F | U | F | Freon 12 & Nitrogen | L
16 | CHRC, 2006 | A | WT | U | D | Carbon Dioxide | L
17 | CHRC, 2006 | B | WT | O | D | Carbon Dioxide | L
18 | CHRC, 2006 | C | WT | O | D | Carbon Dioxide | L
19 | BA-Hamburg | Unobstructed (DA0120) | WT | U | D | Sulfur Hexafluoride | L
20 | BA-Hamburg | Unobstructed (DAT223) | WT | U | D | Sulfur Hexafluoride | L
21 | BA-Hamburg | Upwind fence (039051) | WT | O | D | Sulfur Hexafluoride | L
22 | BA-Hamburg | Upwind fence (039072) | WT | O | D | Sulfur Hexafluoride | L
23 | BA-Hamburg | Downwind fence (DA0501) | WT | O | D | Sulfur Hexafluoride | L
24 | BA-Hamburg | Downwind fence (DA0532) | WT | O | D | Sulfur Hexafluoride | L
25 | BA-Hamburg | Circular fence (039094/039095) | WT | O | D | Sulfur Hexafluoride | L
26 | BA-Hamburg | Circular fence (039097) | WT | O | D | Sulfur Hexafluoride | L
27 | BA-Hamburg | Slope (DAT647) | WT | U | D | Sulfur Hexafluoride | L
28 | BA-Hamburg | Slope (DAT631) | WT | U | D | Sulfur Hexafluoride | L
29 | BA-Hamburg | Slope (DAT632) | WT | U | D | Sulfur Hexafluoride | L
30 | BA-Hamburg | Slope (DAT637) | WT | U | D | Sulfur Hexafluoride | L
31 | BA-TNO | TUV01 | WT | U | D | Sulfur Hexafluoride | L
32 | BA-TNO | TUV02 | WT | O | D | Sulfur Hexafluoride | L
33 | BA-TNO | FLS | WT | U | D | Sulfur Hexafluoride | L
3.3 TEST CONDITIONS

32. Each worksheet contains headings with associated entries summarizing the test conditions as follows:

   a) Trial name – The recognized name of the experimental series, i.e. Burro, Coyote, etc.
   b) Test identifier – A simplified identifier of the test within the experimental series
   c) Date of test – The date of the experiment
   d) Origin of data and date of inclusion – The data sources used, and the date of entry into the Database
   e) Test description – A brief note describing the nature of the experiment
   f) Substance released – Information relating to the physical and chemical properties of the substance released
   g) Release conditions – Information relating to the storage and release conditions
   h) Atmospheric conditions – Information relating to the atmospheric conditions (e.g. wind speed, atmospheric stability) and details of their measurement
   i) Terrain and obstacles – Details of the terrain and obstacles, where this is straightforward. In complex cases (e.g. the terrain elevations for the Burro and Coyote experiments), references are provided for where this can be found in the original data reports
   j) Physical comparison parameters – These entries contain the point-wise and arc-wise data describing the cloud and the locations of measurements, as well as the associated averaging times
   k) Units – SI units are provided for all physical quantities

3.4 PHYSICAL COMPARISON PARAMETERS
33. Model predictions are compared to experimental data in the Database using the following physical comparison parameters:

   i. Point-wise concentrations
   ii. Maximum arc-wise concentrations
   iii. Cloud widths
   iv. Predicted distances to the measured maximum arc-wise concentrations
   v. Distances to the LFL concentration
   vi. Predicted concentration at the measured distance to the LFL
34. The previous Version 11 Database (Coldrick et al., 2010) included only the first four of these parameters. The final two parameters are needed to calculate the Concentration Safety Factor and Distance Safety Factor to the LFL (CSFLFL and DSFLFL), which are requirements following the PHMSA Advisory Bulletin (PHMSA, 2010).
35. Details on the physical comparison parameters, the averaging times and concentration thresholds are provided below.

3.4.1 Point-wise Concentrations

36. The Database contains measurements of concentrations at specific gas sensor locations in each of the experiments, which are collectively called "point-wise" concentrations. The location of the sensors and the averaging times are given for each point-wise concentration in the Excel spreadsheet. A figure is also provided that shows the sensor locations (and wind direction) relative to the source location for each trial.
37. The Database user is expected to input into the relevant cell of the spreadsheet the model's predicted concentration at each of these point-wise locations, having used an appropriate averaging time in the model. The spreadsheet then automatically calculates all of the other performance comparison parameters and SPMs using these point-wise concentrations.

3.4.2 Maximum Arc-wise Concentrations
38. For each given arc distance, the maximum arc-wise concentration is taken as the maximum of the point-wise concentrations on that arc. The point-wise locations are those of the gas sensors used during the experiment. This approach is applied consistently to both measured and predicted data.

39. The Database automatically calculates the predicted maximum arc-wise gas concentration from the maximum of the predicted point-wise concentrations on a given measurement arc.

40. It is important to note that the model should use the mean wind direction that was measured in the experiments (rather than assume the wind is directed along the centerline of the array of sensors). Due to wind meandering effects and turbulent fluctuations in the dispersing cloud, the location of the measured and predicted maximum arc-wise concentrations may differ.
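As a concrete illustration of the arc-wise maximum, the sketch below (in Python, with purely illustrative numbers that are not taken from the Database) groups point-wise predictions by their arc distance and takes the maximum on each arc, mirroring what the spreadsheet formulae do with the values entered by the user.

    # Minimal sketch: maximum arc-wise concentration from point-wise values.
    # Each record is (arc_distance_m, sensor_id, concentration_pct_vv); illustrative data only.
    from collections import defaultdict

    pointwise = [
        (57.0, "G01", 8.2), (57.0, "G02", 11.5), (57.0, "G03", 6.9),
        (140.0, "G11", 3.1), (140.0, "G12", 4.4), (140.0, "G13", 2.7),
    ]

    arc_max = defaultdict(float)
    for arc, _sensor, conc in pointwise:
        arc_max[arc] = max(arc_max[arc], conc)

    for arc in sorted(arc_max):
        print(f"Arc at {arc:6.1f} m: maximum arc-wise concentration {arc_max[arc]:.1f}% v/v")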
3.4.3 Cloud Width

41. For both the measured and predicted data, the cloud width is determined in the Database using the following formula, which is derived from the standard deviation of a frequency distribution (Pasquill, 1977):

\sigma_y^2 = \frac{\sum C y^2}{\sum C} - \left[ \frac{\sum C y}{\sum C} \right]^2     (3.1)

where \sigma_y is the cloud width, C is the long time-averaged concentration, y is the crosswind displacement of each sensor and the summation (indicated by \Sigma) is performed over the point-wise values from the lowest sensor height on each measurement arc.
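A minimal sketch of Equation (3.1) is given below, assuming the long time-averaged concentrations and crosswind sensor displacements for one arc are already to hand (the numbers are illustrative only).

    # Minimal sketch of Equation (3.1): cloud width sigma_y on a single arc, using the
    # long time-averaged concentrations C at the lowest sensor height. Illustrative data only.
    import math

    y = [-40.0, -20.0, 0.0, 20.0, 40.0]   # crosswind displacement of each sensor (m)
    C = [0.3, 1.8, 4.1, 2.2, 0.5]          # long time-averaged concentration (% v/v)

    sum_C = sum(C)
    mean_y = sum(c * yi for c, yi in zip(C, y)) / sum_C        # concentration-weighted mean of y
    mean_y2 = sum(c * yi ** 2 for c, yi in zip(C, y)) / sum_C  # concentration-weighted mean of y^2

    sigma_y = math.sqrt(mean_y2 - mean_y ** 2)                 # Equation (3.1)
    print(f"Cloud width sigma_y = {sigma_y:.1f} m")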
42. Following a similar approach to that taken by Hanna et al. (1993a), three conditions must be met before a measured cloud width is determined:

   a) There must be at least four sensors on an arc that register long time-averaged concentrations greater than 0.1% v/v
   b) The sensor that registers the maximum long time-averaged concentration must not be located at either end of an arc
   c) The lateral concentration distribution must not exhibit a bi-modal pattern with two peaks
43. Model predicted cloud widths are only calculated in the Database where a measured value exists.

44. To satisfy the third condition in the above list, the Database user is required to indicate whether there is a bi-modal pattern in the predicted long time-averaged point-wise concentrations on each arc by entering "Y" or "N" into the appropriate box in the Database entitled "Is the cloud bifurcated along the arc?".

45. If "Y" is entered into this cell, no cloud width will be calculated, whereas if "N" is entered a cloud width will be calculated. If the cell is left blank and the user enters neither "Y" nor "N", then no cloud width value will be calculated.
46. The MEP report (Ivings et al., 2016) provides further information on bifurcated clouds. The example concentration profiles shown in Figure 1 illustrate a cloud that is bifurcated and one that is not bifurcated.

Figure 1 Examples of bifurcated and non-bifurcated clouds (left and right, respectively)
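The Database relies on the user's own judgment (the "Y"/"N" entry described above) rather than any prescribed test for bifurcation. Purely as an illustration of one possible automated check, the hedged sketch below counts the number of distinct peaks above the 0.1% v/v threshold in a lateral profile; it is not the method used in the spreadsheet.

    # Hedged illustration only: flag a lateral profile as bi-modal if it has two or more
    # local maxima above the 0.1% v/v threshold. The Database itself uses the user's "Y"/"N" entry.
    def is_bifurcated(concs, threshold=0.1):
        peaks = 0
        for i in range(1, len(concs) - 1):
            if concs[i] > threshold and concs[i] >= concs[i - 1] and concs[i] > concs[i + 1]:
                peaks += 1
        return peaks >= 2

    print(is_bifurcated([0.2, 1.5, 0.4, 1.6, 0.3]))  # True: two peaks
    print(is_bifurcated([0.2, 0.9, 1.8, 0.9, 0.3]))  # False: single peak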
3.4.4 Predicted Distance to the Measured Maximum Arc-wise Concentration

47. The concept of the predicted distance to the measured maximum arc-wise concentration is illustrated in Figure 2. The Database calculates this parameter automatically and assumes that the predicted concentration decays with distance between arcs according to a power law:

C = A x^{-B}     (3.2)

where C is the predicted maximum arc-wise concentration, x is the distance downstream from the source and A and B are constants whose values are determined by fitting the curve between the maximum arc-wise concentrations at two neighboring arcs. The slope of the curve may not necessarily be continuous with distance downwind, since it is based on a piecewise fit between concentrations at neighboring arcs.
Figure 2 Illustration of the calculation method used to determine the predicted distance to the measured maximum arc-wise concentration: a power-law fit between neighbouring predicted points (model predictions and measurements shown as separate symbols) gives the predicted distance to each measured concentration.
48. In the Excel spreadsheet, the procedure for calculating the predicted distance to the measured maximum arc-wise concentration is as follows:

   a) The first measured maximum arc-wise concentration nearest to the source is selected
   b) A check is made to ensure that this concentration lies within the concentration range of the dataset of predicted maximum arc-wise concentrations. If it does not, then no value of the distance is computed (i.e. concentrations are only interpolated, not extrapolated)
   c) The pair of maximum arc-wise concentration values that span the measured concentration is identified
   d) The natural logarithm is taken of both the arc distances and the maximum arc-wise concentrations for the pair of values found in step c) and a straight trend line is fitted between these two (loge) data points
   e) Linear interpolation along the trend line produced in step d) is used to find the distance at which the predicted concentration equals the measured value.

The procedure outlined in steps a) – e) is then repeated for each of the measured maximum arc-wise concentrations.
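The sketch below illustrates steps c) to e) in Python, assuming the predicted maximum arc-wise concentrations have already been tabulated (the arc distances and concentrations are illustrative, not taken from the Database). The same routine, applied with a target of 5% v/v, reproduces the interpolation used for the LFL distance in Section 3.4.5.

    # Minimal sketch of steps c)-e): interpolate along a power law (a straight line in
    # log-log space) between the pair of predicted arc values that bracket a target
    # concentration. Illustrative data only; no extrapolation beyond the arcs.
    import math

    arc_x = [57.0, 140.0, 400.0]   # arc distances (m)
    arc_c = [12.0, 4.5, 1.2]       # predicted maximum arc-wise concentrations (% v/v)

    def distance_to_concentration(target):
        for (x1, c1), (x2, c2) in zip(zip(arc_x, arc_c), zip(arc_x[1:], arc_c[1:])):
            if min(c1, c2) <= target <= max(c1, c2):
                # straight-line fit between the (ln x, ln C) pairs, then invert for ln x
                slope = (math.log(c2) - math.log(c1)) / (math.log(x2) - math.log(x1))
                ln_x = math.log(x1) + (math.log(target) - math.log(c1)) / slope
                return math.exp(ln_x)
        return None  # target lies outside the predicted range: no value is computed

    print(distance_to_concentration(8.0))  # predicted distance to a measured 8% v/v
    print(distance_to_concentration(5.0))  # predicted distance to the LFL (5% v/v)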
49. Verification tests on this calculation method demonstrated that if the predicted maximum arc-wise concentrations were the same as the measured values, then the distances to the measured maximum arc-wise concentrations were exactly equal to the arc locations (as expected).
3.4.5 Distance to the LFL Concentration

50. Measured and predicted distances to the LFL are automatically calculated in the Database v12 using a similar method to that described above in Section 3.4.4, in which the maximum arc-wise concentration is assumed to decay as a power law between neighboring arcs. The main difference from the procedure described in Section 3.4.4 is that the LFL concentration of 5% v/v (for LNG) is used instead of the measured maximum arc-wise concentration.

51. Extrapolation is not used to extend the curve of measured or predicted maximum arc-wise concentrations beyond the arcs. If the LFL falls outside of the range of measured or predicted concentrations, no value of the distance is computed.

52. There is some uncertainty associated with comparing measured and predicted distances to the LFL concentration. The primary issue is that this distance was not a directly measured quantity during any of the LNG release trials included in the Database. As such, the measured concentration data must be processed in order to determine the measured distance to the LFL. Different interpolation techniques are liable to yield different results. More information is given in the LNG MEP (Ivings et al., 2016).
3.4.6 Predicted Concentration at the Measured Distance to the LFL

53. The predicted concentration at the measured distance to the LFL is calculated automatically in the Database assuming the same power-law decay in predicted maximum arc-wise concentration as that described above in Section 3.4.4. Once the measured distance to the LFL is found from the calculation described in Section 3.4.5, the predicted maximum arc-wise concentration is found at that location by interpolation.

54. Extrapolation is not used to extend the curve of predicted maximum arc-wise concentrations beyond the arcs. Instead, if the measured distance to the LFL falls outside of the arc locations, no value is computed.
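Continuing the hedged sketch from Section 3.4.4, the example below (illustrative numbers only) first interpolates the measured arc-wise maxima to the 5% v/v LFL and then evaluates the predicted power law at that distance, which is the quantity used in the CSFLFL calculation.

    # Hedged illustration of Sections 3.4.5 and 3.4.6, reusing the log-log interpolation idea:
    # find the measured distance to the LFL, then the predicted concentration at that distance.
    # All numbers are illustrative, not taken from the Database.
    import math

    LFL = 5.0  # % v/v for LNG

    meas_x, meas_c = [57.0, 140.0, 400.0], [10.0, 4.0, 1.0]   # measured arc-wise maxima
    pred_x, pred_c = [57.0, 140.0, 400.0], [12.0, 4.5, 1.2]   # predicted arc-wise maxima

    def interp_loglog(xs, cs, x_target=None, c_target=None):
        """Power-law interpolation between neighbouring arcs (no extrapolation)."""
        for x1, c1, x2, c2 in zip(xs, cs, xs[1:], cs[1:]):
            b = (math.log(c2) - math.log(c1)) / (math.log(x2) - math.log(x1))
            if c_target is not None and min(c1, c2) <= c_target <= max(c1, c2):
                return math.exp(math.log(x1) + (math.log(c_target) - math.log(c1)) / b)
            if x_target is not None and x1 <= x_target <= x2:
                return math.exp(math.log(c1) + b * (math.log(x_target) - math.log(x1)))
        return None

    x_lfl_meas = interp_loglog(meas_x, meas_c, c_target=LFL)            # measured distance to LFL
    c_pred_at_lfl = interp_loglog(pred_x, pred_c, x_target=x_lfl_meas)  # predicted conc. there

    print(f"Measured distance to LFL: {x_lfl_meas:.0f} m")
    print(f"Predicted concentration there: {c_pred_at_lfl:.1f}% v/v")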
3.4.7 Averaging Times

55. In the Burro, Coyote and Falcon experiments, the physical comparison parameters are provided for two averaging times: a 'short' and a 'long' time-average (with the exception of the cloud width, which is only calculated using the long time-average). The short averaging time is one second and the long averaging time is comparable to the duration of the steady period of the release. Two different methods are used to process the data using these averaging times: either a fixed averaging-time window or a rolling average (for details, see Section 6.8.8).
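The difference between the two processing methods can be seen in the minimal sketch below, which applies both a fixed (block) averaging window and a rolling average to an illustrative 1 Hz concentration time series; the Database's actual processing of the field data is described in Section 6.8.8.

    # Hedged sketch of the two averaging approaches mentioned above.
    # The 1 Hz concentration time series below is illustrative only.
    import numpy as np

    conc = np.array([0.0, 0.5, 2.0, 4.8, 5.6, 5.1, 4.9, 3.2, 1.1, 0.2])  # % v/v, 1 s samples
    window = 3  # averaging window in samples (seconds at 1 Hz)

    # (i) fixed averaging-time windows: average over consecutive, non-overlapping blocks
    n_blocks = len(conc) // window
    fixed = conc[: n_blocks * window].reshape(n_blocks, window).mean(axis=1)

    # (ii) rolling average: moving mean over every 'window'-sample span
    rolling = np.convolve(conc, np.ones(window) / window, mode="valid")

    print("fixed-window means:", fixed)
    print("rolling means:     ", rolling)
    print("maximum rolling 3 s average:", rolling.max())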
56. For the Maplin Sands experiments, physical comparison parameters are calculated using a 3 s average, and for Thorney Island a 30 s average. For the wind-tunnel tests (CHRC, BA-Hamburg and BA-TNO), long time-averages are used. For each experiment, the averaging time is stated in the Excel spreadsheet.

57. A short discussion on the relevance and consequences of averaging time for the evaluation of LNG dispersion models is provided in the LNG MEP report (Ivings et al., 2016). More information on averaging times can also be found in the book by Hanna et al. (1996).
3.4.8 Lower Limit (Threshold) Concentration

58. In the previous Version 11 Database, any measured concentrations less than 0.1% v/v were omitted from the Database, since it was deemed that the measurement accuracy of low concentrations was uncertain. There were concerns that small measurement errors in absolute terms of, say, 0.05% v/v gas concentration could lead to large errors in the SPMs as a result of the SPMs being calculated from the ratio of two small numbers.

59. In the new Database, any measured point-wise concentrations less than 0.1% v/v are now included, but they are not used to calculate the SPMs. Concentrations equal to or less than 0.1% v/v are highlighted in bold red text, to clearly indicate that they are low concentrations that are not used in the SPM calculations.

60. The reason for adding these low measured concentrations into the Database is two-fold. Firstly, the Database user needs to know the location of all the point-wise sensor locations in order to output model predictions at those locations (which are then used to determine the predicted maximum arc-wise concentration). Secondly, the data provide useful information for model evaluation, even if they are not used to calculate the SPMs. The precise value of the concentration may be uncertain but if a model predicts a much higher value then this indicates (qualitatively) that there is poor agreement between the model prediction and the measurement.
3.5 WIND-TUNNEL DATA AND SCALING

61. Wind-tunnel experiments are widely recognized and accepted as a valuable addition to field-scale experiments for model evaluation, if carried out appropriately (Meroney & Neff, 1982; Meroney, 1987). They allow for more control and repeatability of tests, but the effects of heat transfer and atmospheric stability are difficult to replicate in a wind tunnel.

62. Data from the CHRC, BA-Hamburg and BA-TNO wind tunnel experiments are presented in the Database. These experiments involved only isothermal releases of dense gas in neutral atmospheric stability. Nevertheless, they are still relevant for the evaluation of the key physical processes involved in the dispersion of LNG, namely: the spreading of a dense cloud, the reduction in turbulent mixing due to the density stratification and the influence of the ambient wind field. One of the key reasons for including these experiments in the Database is that they involved the study of various obstacle configurations and sloping terrain.

63. The data from the wind-tunnel experiments are presented in the Database at both wind-tunnel scale and equivalent field scale. The scaling parameters and scaling relations are provided in each worksheet and are discussed in Sections 9.4 and 10.5 of this report. The scaled data is equivalent to the field-scale release of a substance with the same physical properties as that released in the wind tunnel.
64. Due to the inherent uncertainties in scaling the wind-tunnel data, it is a requirement that modelers shall simulate the wind-tunnel experiments at wind-tunnel scale if their model is capable of this. This subject was discussed at the UK Explosion Liaison Group meeting in 2012², where experts from DNV, GexCon, Shell and the University of Arkansas all agreed that the wind-tunnel tests in the Database should be modelled at wind-tunnel scale. However, it is recognized that this may not always be possible, perhaps due to restrictions with the model, and therefore the equivalent field-scale data is provided in the Database.

² http://ukelg.ps.ic.ac.uk/UKELG49.htm, accessed 8 April 2016.
65. Separate tables of SPMs are calculated in the spreadsheet for wind-tunnel data at wind-tunnel scale and at equivalent field scale.

3.6 STATISTICAL PERFORMANCE MEASURES (SPMs)

66. SPMs provide a means of comparing aggregated measured and modelled physical comparison parameters. The final two sheets in the Database spreadsheet present tabulated SPMs, including those recommended by Ivings et al. (2007) and those requested by the more recent PHMSA Advisory Bulletin (PHMSA, 2010).

67. The equations used to calculate the SPMs are presented in Table 3, in which C_m and C_p represent the measured and predicted concentrations, respectively; x_m and x_p the measured and predicted distances to a given concentration, respectively; and x_m,LFL and x_p,LFL the measured and predicted distances to the LFL concentration (which is taken as 5% v/v). The angle brackets denote an average over groups of measured and predicted data. Different groups of data are used for these averages, as described in Section 3.6.2 below.

68. Probably the most straightforward parameters to understand are the Mean Relative Bias (MRB), Mean Relative Square Error (MRSE), Factor of 2 (FAC2), and the Concentration and Distance Safety Factors (CSF and DSF). The Geometric Mean Bias (MG) and Geometric Variance (VG) are less easily understood but have been included because they have been used in several previous model evaluation studies (Hanna et al., 1993b; Duijm et al., 1996; Carissimo et al., 2001; Chang & Hanna, 2004; Hanna et al., 2004) and because they help to ensure that a few very large or very small values of predicted or measured concentrations do not dominate the SPM. Further discussion of the choice of SPMs can be found in the LNG MEP (Ivings et al., 2016).

3.6.1 SPM Calculations in the Database

69. The SPMs are automatically calculated in the Excel spreadsheet and displayed in tables in the final two worksheets for each of the 33 trials and also for the groups of trials (see Section 3.6.2). The format of these tables has been chosen to match the format used in previous PHMSA Final Decision Letters.

70. Table 4 shows the first few rows of one of the tables of individual SPMs included in the Version 12 Database. The full table (see Appendix A) includes SPM values for all of the individual trials.
Table 3 Statistical Performance Measures (SPMs)

Mean Relative Bias (MRB)
  Definition: MRB = ⟨ (C_m − C_p) / (½ (C_p + C_m)) ⟩
  Notes: Indicates whether a model over- or under-predicts the measurements on average. Less sensitive than other metrics to minimum concentration thresholds. Gives a symmetric result for under/over-prediction. More easily understood than MG.

Mean Relative Square Error (MRSE)
  Definition: MRSE = ⟨ (C_p − C_m)² / (¼ (C_p + C_m)²) ⟩
  Notes: Indicates the degree of scatter (variance) in agreement between predictions and measurements. More easily understood than VG.

FAC2: the fraction of predictions within a factor of two of the measurements
  Definition: 0.5 ≤ (C_p / C_m) ≤ 2.0
  Notes: Simple metric showing general model performance.

Geometric Mean Bias (MG)
  Definition: MG = exp⟨ ln(C_m / C_p) ⟩
  Notes: Indicates whether a model over- or under-predicts on average whilst also mitigating the effects of extreme values in measured/predicted concentrations.

Geometric Variance (VG)
  Definition: VG = exp⟨ [ln(C_m / C_p)]² ⟩
  Notes: Indicates the variance relating to MG. Mitigates the effects of extreme values in measured/predicted concentrations.

Concentration Safety Factor (CSF)
  Definition: CSF = ⟨ C_p / C_m ⟩
  Notes: Straightforward metric comparing the predicted and measured concentrations.

Concentration Safety Factor to the Lower Flammability Limit (CSF_LFL)
  Definition: CSF_LFL = ⟨ C_p / LFL ⟩
  Notes: Straightforward metric comparing the predicted and LFL concentrations at the measured/interpolated distance to the LFL.

Distance Safety Factor (DSF)
  Definition: DSF = ⟨ x_p / x_m ⟩
  Notes: Straightforward metric comparing the distance to the measured maximum arc-wise concentration to the interpolated predicted distance to those concentrations.

Distance Safety Factor to the Lower Flammability Limit (DSF_LFL)
  Definition: DSF_LFL = ⟨ x_p,LFL / x_m,LFL ⟩
  Notes: Straightforward metric comparing the predicted to the measured/interpolated distance to the LFL.
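Although the Database computes the SPMs automatically within the spreadsheet, the following hedged sketch shows how several of the concentration-based measures in Table 3 could be reproduced from paired measured and predicted maximum arc-wise concentrations (the values are illustrative only).

    # Hedged sketch of several SPMs from Table 3, computed over paired measured (Cm)
    # and predicted (Cp) maximum arc-wise concentrations. Illustrative values only.
    import numpy as np

    Cm = np.array([10.0, 4.0, 1.0, 0.5])   # measured concentrations (% v/v)
    Cp = np.array([12.0, 4.5, 1.2, 0.3])   # predicted concentrations (% v/v)

    MRB = np.mean((Cm - Cp) / (0.5 * (Cp + Cm)))
    MRSE = np.mean((Cp - Cm) ** 2 / (0.25 * (Cp + Cm) ** 2))
    FAC2 = np.mean((Cp / Cm >= 0.5) & (Cp / Cm <= 2.0))
    MG = np.exp(np.mean(np.log(Cm / Cp)))
    VG = np.exp(np.mean(np.log(Cm / Cp) ** 2))
    CSF = np.mean(Cp / Cm)

    print(f"MRB={MRB:.3f}  MRSE={MRSE:.3f}  FAC2={FAC2:.2f}  MG={MG:.2f}  VG={VG:.2f}  CSF={CSF:.2f}")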
Table 4 Partial table from Database v12 of individual trial SPMs³

Table 1: SPM Evaluation against Quantitative Assessment Criteria
  0.67 < MG < 1.5
  VG < 3.3
  0.5 < FAC2
  0.5 < CSF < 2
  0.5 < CSF_LFL < 2
  0.5