Sea Ice Climate Change Initiative: Phase 1

D2.5 Product Validation & Algorithm Selection Report (PVASR)
Sea Ice Concentration

Doc Ref: SICCI-PVASR
Version: 1.1
Date: 28 August 2013

Consortium Members

ESA UNCLASSIFIED - For Official Use

Product Validation & Algorithm Selection Report (PVASR) Ref. SICCI-PVASR

Issue 1.1 / 28 August 2013

Change Record

Issue  Date            Reason for Change                      Author
1.0    01 May 2013     First Issue
1.1    28 August 2013  Minor adjustments and clarifications   LTP

Authorship

Role            Name                                                  Signature
Written by:     Natalia Ivanova, Leif T. Pedersen and Rasmus Tonboe
Checked by:     Stefan Kern
Approved by:
Authorised by:

Distribution

Organisation: Names (Contact Details)

ESA: Pascal Lecomte ([email protected])
NERSC: Stein Sandven, Lasse H. Pettersson, Natalia Ivanova ([email protected]; [email protected]; [email protected])
Logica: Gary Timms, Ed Pechorro ([email protected]; [email protected])
Met.no: Thomas Lavergne, Lars Anders Breivik ([email protected]; [email protected])
DMI: Leif Toudal Pedersen, Rasmus Tonboe ([email protected]; [email protected])
DTU: Roberto Saldo, René Forsberg, Henning Skriver, Henriette Skourup ([email protected]; [email protected]; [email protected]; [email protected])
FMI: Marko Mäkynen, Eero Rinne, Ari Seina ([email protected]; [email protected]; [email protected])
UCL: ([email protected])
University of Hamburg: Stefan Kern ([email protected])
University of Bremen: Georg Heygster ([email protected])
University of Cambridge: Peter Wadhams ([email protected])
MPI-M: Dirk Notz ([email protected])
Ifremer: Fanny Ardhuin ([email protected])

page 2 of 190


Table of Contents

1      Introduction .......... 17
1.1    Purpose and Scope .......... 17
1.2    Document Structure .......... 17
1.2.1  Algorithm evaluation .......... 17
1.3    Document Status .......... 18
1.4    Applicable Documents .......... 18
1.5    Applicable Standards .......... 18
1.6    Reference Documents .......... 18
1.7    Acronyms and Abbreviations .......... 19
1.8    The algorithm intercomparison and selection procedure .......... 20
1.8.1  Intersensor calibration .......... 20
1.8.2  Validation and error bars .......... 20
1.8.3  Discrepancies at high ice concentrations (~100%) .......... 21
1.8.4  Discrepancies in summer conditions: melt ponds, wet snow and ice .......... 21
1.8.5  Weather filters .......... 22
1.9    Composite algorithm names .......... 22

2      Evaluation of the algorithms over open water (SIC = 0%) .......... 23
2.1    Northern hemisphere open water .......... 23
2.1.1  Summer (June – September) .......... 24
2.1.2  Winter (October – May) .......... 27
2.2    Southern hemisphere open water .......... 31
2.2.1  Summer (December – March) .......... 31
2.2.2  Winter (April – November) .......... 34

3      SIC = 15% .......... 37
3.1    Northern hemisphere .......... 37
3.1.1  Summer (June – September) .......... 37
3.1.2  Winter (October – May) .......... 41
3.2    Southern hemisphere .......... 44
3.2.1  Summer (December – March) .......... 44
3.2.2  Winter (April – November) .......... 48

4      SIC = 100% .......... 52
4.1    Northern hemisphere .......... 52
4.1.1  Summer (June – September) .......... 52
4.1.2  Winter (October – May) .......... 55
4.2    Southern hemisphere .......... 58
4.2.1  Summer (December – March) .......... 59
4.2.2  Winter (April – November) .......... 62
4.3    FY vs MY .......... 65

5      SIC = 85% .......... 72
5.1    Northern hemisphere .......... 72
5.1.1  Summer (June – September) .......... 72
5.1.2  Winter (October – May) .......... 76
5.2    Southern hemisphere .......... 81
5.2.1  Winter (April – November) .......... 81

6      Thin ice .......... 86
6.1    Collection and visualisation of dataset .......... 86
6.2    Results for 2010 .......... 87
6.3    TB variability .......... 93

7      Melt ponds .......... 97
7.1    Comparison between SIC and open water fraction .......... 98

8      Simulated data .......... 103
8.1    The simulated sea ice concentration .......... 103
8.2    The simulated data .......... 103
8.3    The microwave emission models .......... 104
8.4    The snow and sea ice thermodynamic and mass model .......... 105
8.5    Simulation procedure .......... 105
8.6    The simulated data (Antarctic cases) .......... 105
8.7    The simulated data (Arctic cases) .......... 119
8.8    Open Water .......... 129
8.8.1  The Antarctic open water point at 64°S 280°E .......... 130
8.8.2  The Arctic open water point at 70°N 0°E .......... 136

9      Weather filters .......... 142
9.1    Description of weather filters .......... 142
9.2    Results for selected algorithms .......... 143
9.2.1  Summer, SIC = 15% .......... 143
9.2.2  Winter, SIC = 15% .......... 144
9.3    Simulated data (15, 20, 25, 30% ice and cut-off) .......... 146

10     Atmospheric correction .......... 148
10.1   The effect of atmospheric correction .......... 148
10.1.1 RTMs used for atmospheric correction .......... 148
10.1.2 Which atmospheric parameters are available .......... 148
10.2   Reduction in variance of TBs .......... 151
10.2.1 SSMI SIC=0 .......... 151
10.2.2 SSMI SIC=1 .......... 152
10.3   Reduction in variance in SIC (SIC=0) .......... 157
10.4   Reduction in variance in SIC (SIC=1) .......... 159
10.5   Examples for SIC=0 (histograms) .......... 162
10.5.1 N90Lin .......... 162
10.5.2 CP (Comiso Bootstrap polarisation mode (37HV)) .......... 163
10.5.3 Bristol .......... 164
10.5.4 CF (Comiso Bootstrap frequency mode (19+37V)) .......... 165
10.5.5 OSISAF (here also known as OSISAF-2) .......... 166
10.6   Integrated retrieval .......... 167
10.7   Discussion on atmospheric correction .......... 169

11     Algorithms and tie-points .......... 170
11.1   Algorithms .......... 170
11.2   Tie-points for normal processing .......... 171
11.2.1 Background .......... 171
11.2.2 Multi-year ice .......... 174
11.3   Tie-points for atmospheric correction tests .......... 175
11.4   Near 90 GHz algorithms .......... 175

12     Instrument drift .......... 178
12.1   Northern Hemisphere .......... 178
12.1.1 SIC = 0% .......... 178
12.1.2 SIC = 100% .......... 180

13     Summary and conclusions .......... 184

14     Acknowledgements .......... 186

15     References .......... 187


List of Figures

Figure 2-1: Standard deviations, SIC = 0%, Northern hemisphere, summer .......... 26
Figure 2-2: Bias, SIC = 0%, Northern hemisphere, summer .......... 27
Figure 2-3: Standard deviations, SIC = 0%, Northern hemisphere, winter .......... 30
Figure 2-4: Bias, SIC = 0%, Northern hemisphere, winter .......... 30
Figure 2-5: Standard deviations, SIC = 0%, Southern hemisphere, summer .......... 33
Figure 2-6: Bias, SIC = 0%, Southern hemisphere, summer .......... 33
Figure 2-7: Standard deviations, SIC = 0%, Southern hemisphere, winter .......... 36
Figure 2-8: Bias, SIC = 0%, Southern hemisphere, winter .......... 36
Figure 3-1: Standard deviations, SIC = 15%, Northern hemisphere, summer .......... 40
Figure 3-2: Bias, SIC = 15%, Northern hemisphere, summer .......... 40
Figure 3-3: Standard deviations, SIC = 15%, Northern hemisphere, winter .......... 43
Figure 3-4: Bias, SIC = 15%, Northern hemisphere, winter .......... 44
Figure 3-5: Standard deviations, SIC = 15%, Southern hemisphere, summer .......... 47
Figure 3-6: Bias, SIC = 15%, Southern hemisphere, summer .......... 47
Figure 3-7: Standard deviations, SIC = 15%, Southern hemisphere, winter .......... 50
Figure 3-8: Bias, SIC = 15%, Southern hemisphere, winter .......... 50
Figure 4-1: Standard deviations, SIC = 100%, Northern hemisphere, summer .......... 54
Figure 4-2: Bias, SIC = 100%, Northern hemisphere, summer .......... 55
Figure 4-3: Standard deviations, SIC = 100%, Northern hemisphere, winter .......... 58
Figure 4-4: Bias, SIC = 100%, Northern hemisphere, winter .......... 58
Figure 4-5: Standard deviations, SIC = 100%, Southern hemisphere, summer .......... 61
Figure 4-6: Bias, SIC = 100%, Southern hemisphere, summer .......... 61
Figure 4-7: Standard deviations, SIC = 100%, Southern hemisphere, winter .......... 64
Figure 4-8: Bias, SIC = 100%, Southern hemisphere, winter .......... 65
Figure 4-9: Standard deviation MYI vs FYI, SIC = 100%, Northern hemisphere, summer, SSM/I .......... 69
Figure 4-10: Standard deviation MYI vs FYI, SIC = 100%, Northern hemisphere, winter, SSM/I .......... 70


Figure 4-11: Standard deviation MYI vs FYI, SIC = 100%, Southern hemisphere, summer, SSM/I .......... 70
Figure 4-12: Standard deviation MYI vs FYI, SIC = 100%, Southern hemisphere, winter, SSM/I .......... 71
Figure 5-1: Standard deviations, SIC = 85%, Northern hemisphere, summer .......... 75
Figure 5-2: Bias, SIC = 85%, Northern hemisphere, summer .......... 75
Figure 5-3: Histograms of SIC for SSMI Northern hemisphere SIC85 data (summer and winter combined). The orange bar marks SIC = 85%. .......... 78
Figure 5-4: Standard deviations, SIC = 85%, Northern hemisphere, winter .......... 79
Figure 5-5: Bias, SIC = 85%, Northern hemisphere, winter .......... 79
Figure 5-6: Bias, SIC = 85%, Northern hemisphere, winter. Examples of correlation between SSMI SIC85 derived by different algorithms. Note the saturation of the NT2 and ASI algorithms. .......... 80
Figure 5-7: Standard deviations, SIC = 85%, Southern hemisphere, winter .......... 83
Figure 5-8: Bias, SIC = 85%, Southern hemisphere, winter .......... 84
Figure 5-9: NT2 atmosphere, SIC = 85%, Northern hemisphere, all year .......... 85
Figure 5-10: NT2 atmosphere, SIC = 100%, Northern hemisphere, all year .......... 85
Figure 6-1: Left: example map of identified thin ice regions. Right: histogram of the thickness distribution for the included data points .......... 87
Figure 6-2: Plot of retrieved SIC vs SMOS ice thickness, including most of the tested algorithms .......... 88
Figure 6-3: Plot of retrieved SIC vs SMOS ice thickness for selected algorithms, for a clearer view .......... 88
Figure 6-4: Plot of retrieved SIC vs SMOS ice thickness for another selection of algorithms .......... 89
Figure 6-5: Number of SMOS data points per thickness category (1 cm). Above 35 cm the number of data points is small .......... 89
Figure 6-6: Algorithm estimation of SIC for 100% 5 cm thick ice .......... 90
Figure 6-7: Algorithm estimation of SIC for 100% 10 cm thick ice .......... 90
Figure 6-8: Algorithm estimation of SIC for 100% 15 cm thick ice .......... 91
Figure 6-9: Algorithm estimation of SIC for 100% 20 cm thick ice .......... 92
Figure 6-10: Algorithm estimation of SIC for 100% 25 cm thick ice .......... 92



Figure 6-11: Algorithm estimation of SIC for 100% ice at various thicknesses. The 35 cm data should be used with caution due to the limited number of data points at this thickness. .......... 93
Figure 6-12: AMSR vertically polarized TBs as a function of SMOS ice thickness for our validation dataset .......... 94
Figure 6-13: AMSR horizontally polarized TBs as a function of SMOS ice thickness for our validation dataset .......... 94
Figure 6-14: AMSR polarization ratio (PR) as a function of SMOS ice thickness .......... 95
Figure 6-15: NASA-Team multiyear ice concentration as a function of SMOS ice thickness .......... 95
Figure 7-1: Melt pond fraction throughout the dataset. The x-axis goes from June 1 to August 31. July 1 is at 5310 and August 1 at 8122, so very few points are from August. .......... 98
Figure 7-2: SIC derived from selected algorithms as a function of C computed according to (Eq 7.1) .......... 99
Figure 7-3: SIC derived from selected algorithms as a function of C computed according to (Eq 7.1) .......... 99
Figure 7-4: SIC derived from selected algorithms as a function of C computed according to (Eq 7.1) .......... 100
Figure 7-5: SIC derived from selected algorithms as a function of C computed according to (Eq 7.1) .......... 100
Figure 8-1: The simulated snow and ice profile in the Ross Sea (75°S, 200°E) .......... 108
Figure 8-2: The simulated Ross Sea ice concentration .......... 109
Figure 8-3: The sensitivity of the 9 different ice concentration algorithms to the snow-ice interface temperature in the Ross Sea profile (see Figure 8-1) .......... 110
Figure 8-4: Sensitivity of 9 sea ice concentration algorithms to cloud liquid water in the Ross Sea ice simulation .......... 111
Figure 8-5: The sensitivity of 9 algorithms to snow depth in the Ross Sea ice profile .......... 112
Figure 8-6: The sensitivity of 9 sea ice concentration algorithms to the snow temperature gradient in the Ross Sea ice profile .......... 113
Figure 8-7: The sensitivity of 9 sea ice concentration algorithms to the snow surface density in the Ross Sea ice profile .......... 114
Figure 8-8: The sensitivity of 9 sea ice concentration algorithms to atmospheric water vapor in the Ross Sea ice profile .......... 115
Figure 8-9: The sensitivity of 9 sea ice concentration algorithms to the average snow correlation length in the Ross Sea ice profile .......... 116
Figure 8-10: The sensitivity of the 9 sea ice concentration algorithms to the snow surface temperature in the Ross Sea ice profile .......... 117


Figure 8-11: Snow and ice profile in the Lincoln Sea multiyear ice .......... 119
Figure 8-12: The sea ice concentration for the Lincoln Sea multiyear ice profile .......... 120
Figure 8-13: The snow-ice interface temperature in the Lincoln Sea multiyear ice profile .......... 121
Figure 8-14: The sensitivity of the sea ice concentration estimates from 9 different algorithms to snow surface temperature in the Lincoln Sea multiyear ice profile .......... 122
Figure 8-15: The sensitivity of the 9 sea ice concentration algorithms to cloud liquid water in the Lincoln Sea multiyear ice profile .......... 123
Figure 8-16: The sensitivity of the 9 sea ice concentration algorithms to snow depth in the Lincoln Sea multiyear ice profile .......... 124
Figure 8-17: The sensitivity of the 9 sea ice concentration algorithms to the snow temperature gradient in the Lincoln Sea multiyear ice profile .......... 125
Figure 8-18: The sensitivity of the 9 sea ice concentration algorithms to snow surface density in the Lincoln Sea multiyear ice .......... 126
Figure 8-19: The sensitivity of the 9 sea ice concentration algorithms to atmospheric water vapor in the Lincoln Sea multiyear ice profile .......... 127
Figure 8-20: The sensitivity of the 9 sea ice concentration algorithms to the average snow correlation length in the Lincoln Sea multiyear ice profile .......... 128
Figure 8-21: The simulated sensitivity of the 9 sea ice concentration algorithms to cloud liquid water at 64°S 280°E over open water .......... 133
Figure 8-22: The simulated sensitivity of the 9 sea ice concentration algorithms to sea surface temperature at 64°S 280°E over open water .......... 134
Figure 8-23: The simulated sensitivity of the 9 sea ice concentration algorithms to atmospheric water vapor at 64°S 280°E over open water .......... 135
Figure 8-24: The simulated sensitivity of the 9 sea ice concentration algorithms to surface wind at 64°S 280°E over open water .......... 136
Figure 8-25: The simulated sensitivity of the 9 sea ice concentration algorithms to cloud liquid water at 70°N 0°E over open water .......... 138
Figure 8-26: The simulated sensitivity of the 9 sea ice concentration algorithms to sea surface temperature at 70°N 0°E over open water .......... 139
Figure 8-27: The simulated sensitivity of the 9 sea ice concentration algorithms to atmospheric water vapor at 70°N 0°E over open water .......... 140
Figure 8-28: The simulated sensitivity of the 9 sea ice concentration algorithms to wind at 70°N 0°E over open water .......... 141
Figure 9-1: Illustration of weather filter performance with AMSR-E data from 2008, Northern hemisphere. The x-axis is GR1 and the y-axis is GR2 from equation 9-2 .......... 147
Figure 10-1: Example histogram of water vapour (SIC=0, 2008, AMSR) .......... 149


Figure 10-2: Example histogram of wind speed (SIC=0, 2008, AMSR) .......... 149
Figure 10-3: Example histogram of cloud liquid water (SIC=0, 2008, AMSR) .......... 150
Figure 10-4: Example histogram of T2m (SIC=0, 2008, AMSR) .......... 151
Figure 10-5: Example scatter plot of T2m vs Ts. Marker size is proportional to WV (ERA-Interim, 2008, SIC=0, AMSR) .......... 151
Figure 10-6: TB36V before atmospheric correction .......... 153
Figure 10-7: TB36V after atmospheric correction .......... 154
Figure 10-8: TB36H before correction .......... 154
Figure 10-9: TB36H after correction .......... 155
Figure 10-10: TB89V before correction .......... 156
Figure 10-11: TB89V after correction .......... 156
Figure 10-12: Histogram, SIC=0, Near 90 GHz Lin dyn. Note that the histogram has been truncated at -100 and +100 .......... 162
Figure 10-13: Histogram after RTM, SIC=0, Near 90 GHz Lin dyn .......... 162
Figure 10-14: Histogram, SIC=0, Bootstrap P .......... 163
Figure 10-15: Histogram after RTM, SIC=0, Bootstrap P .......... 163
Figure 10-16: Histogram, SIC=0, Bristol .......... 164
Figure 10-17: Histogram after RTM, SIC=0, Bristol .......... 164
Figure 10-18: Histogram, SIC=0, Bootstrap F .......... 165
Figure 10-19: Histogram after RTM, SIC=0, Bootstrap F .......... 165
Figure 10-20: Histogram, SIC=0, OSISAF-2 .......... 166
Figure 10-21: Histogram after RTM, SIC=0, OSISAF-2 .......... 166
Figure 10-22: Histogram of SIC=0 retrieval using , Bootstrap F .......... 168
Figure 10-23: Histogram of the integrated retrieval of SIC=0. Same x-axis (-10% to 10%) as figure 10-24 but given in fractions rather than % .......... 168
Figure 11-1: Code for computing atmospherically corrected tie-points for the NORSEX algorithm. The same atmospheric opacities are used for SMMR, SSMI and AMSR .......... 171
Figure 11-2: Relationship between P90 (n90V-n90H) and sea ice concentration .......... 176
Figure 11-3: Relationship between P85 (85V-85H) and sea ice concentration near SIC0 for the 4 near-90 algorithms in the form they were tested .......... 176



Figure 11-4: Scatterplot of P90 before atmospheric correction (x-axis) vs P90 after correction (y-axis) .......... 177
Figure 12-1: SIC = 0%, Northern Hemisphere, average winter sea ice concentrations from selected algorithms. SMMR, SSM/I, AMSR .......... 178
Figure 12-2: SIC = 0%, Northern Hemisphere, average winter brightness temperatures for selected channels. SMMR, SSM/I, AMSR .......... 179
Figure 12-3: SIC = 100%, Northern Hemisphere, average winter sea ice concentrations from selected algorithms. SSM/I, AMSR .......... 180
Figure 12-4: SIC = 100%, Northern Hemisphere, average winter brightness temperatures for selected channels. SSM/I, AMSR .......... 181
Figure 12-5: SIC from the 2008 SIC1 SSMI dataset before ATM correction. The x-axis is sample number. Most points are in winter-spring. July 1 is 4592; October 1 is 4896 .......... 182



List of Tables

Table 1-1: Applicable Documents .......... 18
Table 1-2: Applicable Standards .......... 18
Table 1-3: Reference Documents .......... 19
Table 1-4: Acronyms .......... 20
Table 1-5: Composite algorithm names .......... 22
Table 2-1: SIC = 0%, Northern Hemisphere, summer .......... 25
Table 2-2: SIC = 0%, Northern Hemisphere, summer. Average over all the instruments present for a given algorithm .......... 26
Table 2-3: SIC = 0%, Northern Hemisphere, winter .......... 28
Table 2-4: SIC = 0%, Northern Hemisphere, winter. Average over all the instruments present for a given algorithm .......... 29
Table 2-5: SIC = 0%, Southern Hemisphere, summer .......... 32
Table 2-6: SIC = 0%, Southern Hemisphere, summer. Average over all the instruments present for a given algorithm .......... 33
Table 2-7: SIC = 0%, Southern Hemisphere, winter .......... 35
Table 2-8: SIC = 0%, Southern Hemisphere, winter. Average over all the instruments present for a given algorithm .......... 35
Table 3-1: SIC = 15%, Northern Hemisphere, summer .......... 38
Table 3-2: SIC = 15%, Northern Hemisphere, summer. Average over all the instruments present for a given algorithm .......... 39
Table 3-3: SIC = 15%, Northern Hemisphere, winter .......... 42
Table 3-4: SIC = 15%, Northern Hemisphere, winter. Average over all the instruments present for a given algorithm .......... 43
Table 3-5: SIC = 15%, Southern Hemisphere, summer .......... 45
Table 3-6: SIC = 15%, Southern Hemisphere, summer. Average over all the instruments present for a given algorithm .......... 46
Table 3-7: SIC = 15%, Southern Hemisphere, winter .......... 48
Table 3-8: SIC = 15%, Southern Hemisphere, winter. Average over all the instruments present for a given algorithm .......... 49
Table 4-1: SIC = 100%, Northern Hemisphere, summer .......... 53

page 12 of 190

ESA UNCLASSIFIED - For Official Use

Product Validation & Algorithm Selection Report (PVASR) Ref. SICCI-PVASR

Issue 0.1 / 01 May 2013

Table 4-2: SIC = 100%, Northern Hemisphere, summer. Average over all the instruments present for given algorithm ...... 54
Table 4-3: SIC = 100%, Northern Hemisphere, winter ...... 56
Table 4-4: SIC = 100%, Northern Hemisphere, winter. Average over all the instruments present for given algorithm ...... 57
Table 4-5: SIC = 100%, Southern Hemisphere, summer ...... 59
Table 4-6: SIC = 100%, Southern Hemisphere, summer. Average over all the instruments present for given algorithm ...... 60
Table 4-7: SIC = 100%, Southern Hemisphere, winter ...... 63
Table 4-8: SIC = 100%, Southern Hemisphere, winter. Average over all the instruments present for given algorithm ...... 64
Table 4-9: SIC = 100%, Northern Hemisphere, summer, MYI vs FYI ...... 66
Table 4-10: SIC = 100%, Northern Hemisphere, winter, MYI vs FYI ...... 67
Table 4-11: SIC = 100%, Southern Hemisphere, summer, MYI vs FYI ...... 68
Table 4-12: SIC = 100%, Southern Hemisphere, winter, MYI vs FYI ...... 69
Table 5-1: SIC = 85%, Northern Hemisphere, summer ...... 73
Table 5-2: SIC = 85%, Northern Hemisphere, summer. Average over all the instruments present for given algorithm ...... 74
Table 5-3: SIC = 85%, Northern Hemisphere, winter ...... 77
Table 5-4: SIC = 85%, Northern Hemisphere, winter. Average over all the instruments present for given algorithm ...... 77
Table 5-5: SIC = 85%, Southern Hemisphere, winter ...... 82
Table 5-6: SIC = 85%, Southern Hemisphere, winter. Average over all the instruments present for given algorithm ...... 83
Table 7-1: Average values and stdevs for MODIS derived SIC and MPF and for AMSR TBs ...... 101
Table 7-2: Calculated SIC and their stdevs for the tested algorithms. Upper row in red are MODIS derived reference values. The bottom row is multi-year ice concentration as derived with the NASA Team algorithm ...... 102
Table 8-1: Categorization of the 9 selected algorithms. The polarization algorithms are using the polarization difference or ratio. The gradient algorithms are using the spectral gradient e.g. at Tb19v and Tb37v. The "hybrid" refers to a combination of polarization and gradient. Here low frequency is 6 GHz and high frequency is near 90 GHz. ESMR is the single channel (Tb19h) radiometer on NIMBUS 5 ...... 106
Table 8-2: The Ross Sea correlation matrix. The snow surface density: Dens, the average snow correlation length: appc, the snow surface temperature: Ti, the snow ice interface temperature: ist, the snow depth: St, the snow temperature gradient: snowg, the atmospheric water vapor: Vapor, the cloud liquid water: Liquid ...... 107
Table 8-3: Mean ice concentration, the standard deviation with or without atmosphere and the mean atmosphere/no atmosphere ice concentration ratio for the 9 algorithms in the Ross Sea ice profile ...... 117
Table 8-4: The partial correlation of each of the 6 parameters (the snow ice interface temperature, the snow surface temperature, the snow depth, the snow surface density, atmospheric water vapor, cloud liquid water) and the ice concentration with the effects of the other physical parameters (snow surface density, the average snow correlation length, snow surface temperature, the snow ice interface temperature, snow depth, snow temperature gradient, atmospheric water vapor, cloud liquid water, excluding one of the 6) removed ...... 118
Table 8-5: The mean ice concentration, the standard deviation with or without atmosphere and the mean atmosphere/no atmosphere ice concentration ratio for the 9 algorithms in the Lincoln Sea ice profile. The "no atmosphere" case is the surface emission only. The "atmosphere included" is the atmospheric emission, absorption and reflection computed with a modified Wentz model in addition to the surface emission. This includes oxygen absorption ...... 129
Table 8-6: Partial correlations: the partial correlation of each of the 6 parameters (snow ice interface temperature, surface temperature, snow depth, snow surface density, atmospheric water vapor, cloud liquid water) and the ice concentration with the effects of the other physical parameters (snow surface density, snow correlation length, snow surface temperature, snow ice interface temperature, snow depth, snow temperature gradient, atmospheric water vapor, cloud liquid water, excluding one of the 6) removed ...... 129
Table 8-7: The 64°S 280°E sample MEAN and STDEV ...... 130
Table 8-8: The Bellinghausen Sea open water correlation matrix, r ...... 131
Table 8-9: The partial correlation, r, of each of the 4 input parameters to the Wentz model (the surface wind speed, atmospheric water vapor, the cloud liquid water, and the sea surface temperature) and the ice concentration with the effects of the other physical parameters removed (the one in question excluded) ...... 132
Table 8-10: The partial correlation, r, of each of the 4 input parameters to the Wentz model (the surface wind speed, atmospheric water vapor, the cloud liquid water, and the sea surface temperature) and the ice concentration with the effects of the other physical parameters removed (the one in question excluded) ...... 136
Table 8-11: Correlation matrix at 70°N 0°E, r, with the four input parameters to the Wentz model and the ice concentration from selected algorithms ...... 137
Table 8-12: The mean, MEAN, and standard deviation, STDEV, of the simulated data at 70°N 0°E ...... 138
Table 9-1: SIC = 15%, summer. Standard deviations of concentrations with and without weather filters ...... 143
Table 9-2: SIC = 15%, summer. Average concentrations with and without weather filters ...... 144
Table 9-3: SIC = 15%, winter. Standard deviations of concentrations with and without weather filters ...... 145
Table 9-4: SIC = 15%, winter. Average concentrations with and without weather filters ...... 146
Table 10-1: Reduction of TB variance by correction for various atmospheric terms from ERA Interim co-located data. A total of 6621 data points from both hemispheres and all seasons were used in the assessment. Only SIC = 0 ...... 152
Table 10-2: Reduction of TB variance by correction for various Teff terms from ERA Interim co-located data. A total of 1220 data points from both hemispheres and all seasons were used in the assessment ...... 153
Table 10-3: Average SIC and standard deviation of SIC before (orange) and after (green) atmospheric correction of TBs. Open water cases. Results are computed with the original tie-points without atmospheric correction for all cases, which explains the substantial biases of many algorithms ...... 157
Table 10-4: SIC = 0 standard deviations before and after atmospheric correction. Atmospheric corrected TPs used for atmospheric corrected data. AMSR, Northern Hemisphere ...... 158
Table 10-5: SIC = 0 average and standard deviations before (red) and after (green) atmospheric correction of TBs. Atmospheric corrected tie-points used for atmospheric corrected analysis. The table summarises results from northern and southern hemispheres and for summer and winter. Near90 results are suspected to be erroneous ...... 159
Table 10-6: Average SIC and standard deviation of SIC before (orange) and after (green) atmospheric correction of TBs. SIC = 1 cases ...... 160
Table 10-7: Results of atmospheric correction for SIC = 1 and split in FY and MY. MY is the 15% of data points with lowest 36H and FY is the 15% with highest 36H. Northern Hemisphere only ...... 161
Table 10-8: Average and standard deviation of retrieval of SIC and a number of other parameters using an integrated retrieval method ...... 167
Table 11-1: Tie-points for Northern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm ...... 172
Table 11-2: Tie-points for Southern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm ...... 172
Table 11-3: Tie-points for Northern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice ...... 173
Table 11-4: Tie-points for Southern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice ...... 173
Table 11-5: Tie-points for Northern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice ...... 174
Table 11-6: Tie-points for Southern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice ...... 174
Table 11-7: Tie-points SIC = 0 used for testing algorithm performance with atmospheric corrected TBs. Same TPs are used for N and S ...... 175



1 Introduction

1.1 Purpose and Scope

This document, the Product Validation and Algorithm Selection Report (PVASR), describes the analysis of the different algorithms using the round robin test data. It describes the criteria for selection and the results of the evaluation in the algorithm intercomparison. The algorithm which is then selected is described in the Algorithm Theoretical Basis Document (ATBD). The ATBD also includes a more detailed description of all the algorithms tested, including the Python computer code.

1.2 Document Structure

After this introduction and the list of references, the document is divided into a number of chapters dealing with each part of the algorithm validation.

1.2.1 Algorithm evaluation

Algorithms will be evaluated independently for each identified source of error/uncertainty. Eventually we intend to fine-tune the selected algorithm(s) with dynamic tie-points in order to minimize the effect of sensor drift and inter-sensor differences. In their published form the algorithms were tuned to a specific version of the source microwave radiometer data. Since we will be using different sources of data, a new calibration (tie-point tuning) of the algorithms is required in any case. As a consequence, in the algorithm evaluation we will be less interested in eventual biases relative to the correct ice concentrations, and more interested in RMS errors around the expected values. Algorithms will be tested against the following sources of errors:

1.2.2 Sensitivity to atmosphere

• Open water dataset selected in regional seas around the Arctic and Antarctic
• Simulated data using the Wentz forward models
• Met/ocean data screening

1.2.3 Sensitivity to emissivity variations

• 100% ice dataset
• Simulated data
• Snow/ice/atmosphere data screening
• Summer/winter sensitivity differences

In addition the algorithms will be compared and evaluated concerning the following characteristics:

• Summer performance (melt ponds etc.)
• SMMR vs. SSM/I vs. AMSR performance
• Thin ice performance
• Potentially obtainable spatial resolution vs. uncertainty
• Potential time period, e.g. 10, 20 or 30 years
• Hemisphere differences (North and South)
• Summer/winter sensitivity differences

1.3 Document Status

This is a first issue release to ESA as part of the project's contractual deliverable set.

1.4 Applicable Documents

The following table lists the Applicable Documents that have a direct impact on the contents of this document.

Acronym | Title | Reference | Issue
AD-1 | Sea Ice ECV Project Management Plan | ESA-CCI_SICCI_PMP_D6.1_v1.1 | 1.1

Table 1-1: Applicable Documents

1.5 Applicable Standards

Acronym | Title | Reference | Issue

Table 1-2: Applicable Standards

1.6 Reference Documents

Acronym | Title | Reference | Issue
URD-1 | Sea Ice ECV User Requirement Survey | | 1.0
ATBDv0 | Sea Ice ECV Algorithm Theoretical Basis Document | | 1.1
CECR_01 | Sea Ice ECV Comprehensive Error Characterisation Report | | 1.1
PVP | Sea Ice ECV Product Validation Protocol | | 1.1
ATBDv1 | Sea Ice ECV Algorithm Theoretical Basis Document | | 1.0

Table 1-3: Reference Documents

1.7 Acronyms and Abbreviations

Acronym | Meaning
AMSR-E | Advanced Microwave Scanning Radiometer aboard EOS
AO | Announcement of Opportunity
ASCII | American Standard Code for Information Interchange
ASIRAS | Airborne Synthetic Aperture and Interferometric Radar Altimeter System
CM-SAF | Climate Monitoring Satellite Application Facility
DMSP | Defence Meteorological Satellite Program
DWD | Deutscher Wetterdienst
ECV | Essential Climate Variable
Envisat | Environmental Satellite
ESA | European Space Agency
EUMETSAT | European Organisation for the Exploitation of Meteorological Satellites
FCDR | Fundamental Climate Data Record
FOC | Free of Charge
FOV | Field-of-View
FTP | File Transfer Protocol
GB | GigaByte
GCOM | Global Change Observation Mission
H | Horizontal polarization
H+V | Horizontal and vertical polarization
MB | MegaByte
MODIS | Moderate Resolution Imaging Spectroradiometer
n.a. | Not applicable
NetCDF | Network Common Data Format
NSIDC | National Snow and Ice Data Center
OIB | Operation IceBridge
OSI-SAF | Ocean and Sea Ice Satellite Application Facility
PI | Principal Investigator
PMW | Passive Microwave
POES | Polar Operational Environmental Satellite
PRF | Pulse Repetition Frequency
RADAR | Radio Detection and Ranging
SAR | Synthetic Aperture Radar
SIC | Sea Ice Concentration
SIRAL | SAR/Interferometric Radar Altimeter
SIT | Sea Ice Thickness
SMMR | Scanning Multichannel Microwave Radiometer
SSM/I | Special Sensor Microwave/Imager
SSM/IS | Special Sensor Microwave Imager/Sounder
TB | TeraByte
t.b.d. | To be determined
TM | Thematic Mapper
ULS | Upward Looking Sonar
URL | Uniform Resource Locator
V | Vertical polarization

Table 1-4: Acronyms

1.8 The algorithm intercomparison and selection procedure

1.8.1 Intersensor calibration

Sea ice concentration observations will be intercalibrated among observations from the Scanning Multichannel Microwave Radiometer (SMMR) (1978-1987), various Special Sensor Microwave/Imager (SSM/I) instruments (1987-present) and the Advanced Microwave Scanning Radiometer (AMSR) (2003-2011) (see DARD: IDs 1.01 to 1.03). Tie-points are typical signatures of 100% ice and open water which are used in the ice concentration algorithms as a reference. The tie-points are derived by selecting brightness temperatures from regions of known open water and 100% ice. Usually these tie-points are static in time and space, but they can be adjusted to follow the seasonally changing signatures of ice and open water (see e.g. Kern and Heygster, 2001), as is currently done, for instance, in the operational OSI-SAF ice concentration processing [REF]. Static tie-points are prone to be affected by sensor drift, inter-sensor calibration differences and climatic trends in surface and atmospheric emission. The data must therefore be carefully calibrated before computing the ice concentrations. Here we will investigate the use of dynamic tie-points, a method that minimizes these unwanted effects, with or without prior calibration of the passive microwave data. Such dynamic tie-points will substantially facilitate the combination of data from different sensors.
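The tie-point mechanics described above can be sketched as follows. This is an illustrative one-channel linear algorithm with tie-points taken as medians of brightness temperatures sampled over known surfaces; the channel choice, the sample values and the use of the median are assumptions made for the example, not the SICCI implementation.

```python
import numpy as np

def linear_sic(tb, tp_water, tp_ice):
    """One-channel linear ice concentration: 0 at the open-water
    tie-point and 1 at the 100% ice tie-point (illustrative only)."""
    return (tb - tp_water) / (tp_ice - tp_water)

def dynamic_tiepoints(tb_over_water, tb_over_ice):
    """Derive tie-points dynamically as robust central values of
    brightness temperatures sampled over known open water and known
    near-100% ice for the period in question."""
    return np.median(tb_over_water), np.median(tb_over_ice)

# Hypothetical 19V samples (K) from open-water and consolidated-ice areas
tp_w, tp_i = dynamic_tiepoints(np.array([176.0, 178.5, 181.0]),
                               np.array([249.0, 251.5, 252.0]))
sic = linear_sic(np.array([178.5, 215.0, 251.5]), tp_w, tp_i)  # 0.0, 0.5, 1.0
```

Recomputing the tie-points per sensor and period absorbs sensor drift and inter-sensor calibration offsets into the tie-points instead of into the retrieved concentrations.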

1.8.2 Validation and error bars

The complete transfer of uncertainty will be investigated in this project, from L1b swath-based brightness temperatures to the daily gridded composite. The sensitivity of the various sea ice concentration algorithms will be evaluated with respect to factors such as atmospheric noise, surface emissivity uncertainty and noise in the observed brightness temperatures. A research effort will be conducted on the gridding and projection algorithms, and will result in a quantitative transfer of the uncertainty associated with this step, depending on the FoV size and shape of the instruments. Error bars will be assessed both theoretically and empirically, and their consistency will be checked against results from an extensive validation exercise.

1.8.3 Discrepancies at high ice concentrations (~100%)

During winter, in the consolidated ice well within the ice edge, the ice concentration is very near 100% [Andersen et al., 2007]. This has been established using high resolution SAR data, ship observations and by comparing the estimates from different ice concentration algorithms. The fluxes between the ocean/ice and the atmosphere are sensitive to small variations in this range of sea ice concentration, and these discrepancies are therefore of large importance for coupled climate models. The apparent fluctuations in the derived ice concentration in the near-100% ice regime are primarily attributed to snow/ice surface emissivity variability around the tie-point signature and only secondarily to actual ice concentration fluctuations [Kwok, 2002]. In the marginal ice zone the atmospheric extinction may be significant. The fluctuations due to atmospheric and surface emission are systematic. In fact, different algorithms with different sensitivities to atmospheric extinction and surface emission yield quite different trends in sea ice area and extent on seasonal and decadal time scales [Andersen et al., 2007]. This means that not only does the sea ice area have a climatic trend, but the atmospheric and surface constituents affecting the microwave emission are also changing: for example, wind patterns, water vapour and liquid water concentrations in the atmosphere, snow depth, the fraction of perennial ice, etc. The present project will pay particular attention to retrieval close to 100% sea ice concentration by 1) including 100% ice covered areas in the test/validation datasets, 2) detecting these 100% ice covered situations by SAR and high-resolution optical data, 3) supplementing the detection of these situations using deformation information processed from a SAR sea ice motion dataset, and 4) studying the sensitivity of PMW algorithms to variations in surface emissivity (ice type, snow, etc.) using model simulations.

1.8.4 Discrepancies in summer conditions: melt ponds, wet snow and ice

Reflectances from MODIS channels 1, 3 and 4 are used to derive the melt pond cover fraction and a summer-time ice concentration estimate. The melt pond cover fraction is determined using a classification which follows a mixed-pixel approach. It is assumed that the reflectance measured over each MODIS 500 m grid cell comprises contributions from three surface types: melt ponds, open water, and sea ice/snow [Roesel et al., 2012a]. Using known reflectance values [e.g. Tschudi et al., 2008], a neural network is built, trained, and applied [Roesel et al., 2012a]. The resulting surface type class distributions are saved and made available as a 12.5 km grid resolution product (DARD ID: 2.08); melt pond cover fractions are adjusted to the fraction of the sea ice cover per pixel, i.e. the melt pond cover fraction is not given relative to the area of the grid cell but relative to the sea ice area of that grid cell.
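The mixed-pixel assumption above can be written as a small linear system: the observed reflectance in each band is a fraction-weighted sum of the class reflectances, with the fractions summing to one. The sketch below solves that system by constrained least squares; the class reflectance values are placeholders for illustration, not the trained values of the neural network of Roesel et al. (2012a).

```python
import numpy as np

# Illustrative class reflectances for three MODIS bands (rows) and three
# surface types (columns): melt pond, open water, sea ice/snow.
R = np.array([[0.16, 0.08, 0.95],   # band 1
              [0.22, 0.08, 0.95],   # band 3
              [0.30, 0.07, 0.87]])  # band 4

def unmix(reflectance):
    """Solve reflectance = R @ fractions with the fractions constrained
    to sum to one (appended as an extra least-squares equation)."""
    A = np.vstack([R, np.ones(3)])
    b = np.append(reflectance, 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(f, 0.0, 1.0)

truth = np.array([0.30, 0.20, 0.50])   # pond, water, ice fractions
f = unmix(R @ truth)
# Melt pond fraction relative to the sea ice part of the pixel
# (ponds counted as part of the ice cover), as in the 12.5 km product:
mpf = f[0] / (f[0] + f[2])
```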

1.8.5 Weather filters

Many algorithms deploy weather filters to avoid detecting ice in open water. The brightness temperature noise is generated by wind roughening of the ocean surface, by water (vapour or liquid) in the atmosphere, or by precipitation. We will as far as possible test the algorithms both with and without these weather filters, and we may decide to use a selected algorithm together with weather filters published with non-selected algorithms.
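A typical weather filter operates on spectral gradient ratios and resets the retrieved concentration to zero where the ratios look like weather over open water. The sketch below uses thresholds of the kind published with the NASA Team algorithm (GR(37V/19V) above roughly 0.05, GR(22V/19V) above roughly 0.045); treat the exact values here as illustrative defaults rather than the settings used in this study.

```python
def gradient_ratio(tb_a, tb_b):
    """Spectral gradient ratio GR(a/b) = (TBa - TBb) / (TBa + TBb)."""
    return (tb_a - tb_b) / (tb_a + tb_b)

def weather_filter(sic, tb19v, tb22v, tb37v,
                   gr3719_max=0.05, gr2219_max=0.045):
    """Reset SIC to zero where the gradient ratios indicate
    weather-contaminated open water (illustrative thresholds)."""
    if gradient_ratio(tb37v, tb19v) > gr3719_max:
        return 0.0
    if gradient_ratio(tb22v, tb19v) > gr2219_max:
        return 0.0
    return sic

# Windy, moist open-water signature: 37V and 22V well above 19V
flagged = weather_filter(0.12, tb19v=185.0, tb22v=205.0, tb37v=210.0)  # 0.0
# Consolidated winter ice: gradient ratios are negative, SIC is kept
kept = weather_filter(0.98, tb19v=250.0, tb22v=248.0, tb37v=230.0)     # 0.98
```

The trade-off evaluated in section 9 is visible here: the same thresholds that remove weather noise over open water will also remove true low concentrations and thin ice whose spectra resemble open water.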

1.9 Composite algorithm names

The OSI-SAF sea ice concentration data set is based on a combination of different algorithms. In addition to the original algorithms we will also test the performance of this combination, as well as of other composite algorithms. For convenience, the short names of these composite algorithms are used in the figures of this document.

Short name | Description
Combo1 | (NASA Team + Bootstrap_F)/2
Combo2 | (NASA Team + Bootstrap_F + Near90GHz lin dyn)/3
Combo3 | (P37 + Near90GHz lin dyn)/2
Combo4 | (P37 + Near90GHz lin dyn + Bootstrap_F)/3
Combo5 | (Bootstrap_F + Bootstrap_F² · Near90GHz lin dyn)/(1 + Bootstrap_F²)
Combo6 | (Bootstrap_F + Bootstrap_F³ · Near90GHz lin dyn)/(1 + Bootstrap_F³)
Combo7 | (Bootstrap_F + Near90GHz lin dyn)/2
Combo8 | (Bootstrap_F + Bootstrap_F · Near90GHz lin dyn)/(1 + Bootstrap_F)

Table 1-5: Composite algorithm names
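The composites in Table 1-5 are straightforward to form once the individual algorithms' concentrations are available. The sketch below transcribes the table directly; note that it reads the flattened "Bootstrap_F2"/"Bootstrap_F3" of the source as squared and cubed weights, which makes Combo5, Combo6 and Combo8 blends in which the noisy near-90 GHz estimate contributes mainly where Bootstrap F itself is high.

```python
def composites(nt, bf, n90, p37):
    """Composite SICs from Table 1-5, all inputs as fractions.
    nt: NASA Team, bf: Bootstrap_F, n90: Near90GHz lin dyn, p37: P37."""
    return {
        "Combo1": (nt + bf) / 2,
        "Combo2": (nt + bf + n90) / 3,
        "Combo3": (p37 + n90) / 2,
        "Combo4": (p37 + n90 + bf) / 3,
        "Combo5": (bf + bf**2 * n90) / (1 + bf**2),
        "Combo6": (bf + bf**3 * n90) / (1 + bf**3),
        "Combo7": (bf + n90) / 2,
        "Combo8": (bf + bf * n90) / (1 + bf),
    }

# Hypothetical single-pixel retrievals from the four base algorithms:
c = composites(nt=0.90, bf=0.95, n90=0.80, p37=0.88)
```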


2 Evaluation of the algorithms over open water (SIC = 0%)

In principle every algorithm should be evaluated over open water, at intermediate concentrations and at near 100% ice cover. In practice it is very difficult to find reference data at intermediate concentrations, especially for large areas covering entire satellite footprints (70 km) and covering all seasons and ice types. These issues compromise the meaningfulness of an evaluation at intermediate concentrations. Therefore the evaluation is carried out only for the SIC = 0% and SIC = 100% cases, for which we can obtain the highest quality reference data. For some algorithms it was necessary to use SIC = 15% and SIC = 85% instead of 0% and 100%; the reason for this is explained in section 2.1.

The input data to the algorithms are sets of brightness temperatures corresponding to sea ice concentrations equal to 0%. Ice charts were used to identify such zones. The data are provided from the instruments SMMR, SSM/I and AMSR for the Northern and Southern Hemispheres. Northern Hemisphere summer is defined as the months June to September, and winter as October to May. Southern Hemisphere summer is defined as December to March, and winter as April to November.

In this and the following three sections (SIC = 15%, SIC = 100%, SIC = 85%) the algorithms are run without weather filters applied. The weather filter study is presented in section 9. The algorithms are characterized by their standard deviation and bias relative to the validation dataset. The bias is defined as the difference from the given concentration value of the validation dataset (SIC = 0%, 15%, 85%, 100%). The ASI algorithm is called ASI_NWF to mark the version of the algorithm used here (NWF stands for "no weather filter").
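The two statistics reported throughout the evaluation tables can be computed as follows; the retrieval values in the example are hypothetical.

```python
import numpy as np

def evaluate(sic_retrieved, sic_reference):
    """Bias and standard deviation as reported in the evaluation tables:
    the bias is the mean departure from the known reference concentration
    of the validation dataset, and the standard deviation quantifies the
    algorithm's noise sensitivity around that reference."""
    diff = np.asarray(sic_retrieved, dtype=float) - sic_reference
    return diff.mean(), diff.std()

# Hypothetical retrievals (in %) over an open-water (SIC = 0%) site:
bias, std = evaluate([2.0, -1.0, 3.0, 0.0], sic_reference=0.0)
```

Because the algorithms are retuned with a common set of tie-points (section 11), the standard deviation, rather than the bias, is the quantity of primary interest here.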

2.1 Northern Hemisphere open water

The SMMR, SSM/I and AMSR data were collected at open water sites in the Northern Hemisphere, at a safe distance from, but not too far from, the ice edge. There are different sites for summer and winter to follow the seasonal variation of the ice edge. The sites are in the Atlantic and in the Pacific. We know that the ice concentration over open water is always zero, so all variability and standard deviation in the data produced by the different algorithms is a quantification of their sensitivity to noise. The noise may come from the radiometer instrument, from wind-induced surface roughness, from surface and atmospheric temperature variability, and from atmospheric water vapour and clouds. Normally a bias in the ice concentration over open water is a question of adjusting the tie-points. We use a standard set of tie-points described in section 11, and a bias is not necessarily an indication of poor algorithm performance. However, some algorithms have a special way of estimating tie-points and a non-linear way of dealing with ice concentrations near 0% (and at 100%). This makes it impossible to compare these algorithms directly to other algorithms, because the standard deviation is affected by the treatment at the 0% and 100% reference points. Therefore, we have artificially produced reference datasets at 15% and at 85% for evaluating these special algorithms at those points instead. A potential bias at these intermediate reference points may indicate a bias at intermediate concentrations in general.

2.1.1 Summer (June–September)

Table 2-1 shows the standard deviation and average ice concentration at the open water reference sites for all three sensors separately and all algorithms during summer (June–September). The gaps in the table occur where a given algorithm uses the near-90 GHz channels, which are not on SMMR, or the 6 GHz channels, which are not on SSM/I. In the tables throughout the document we use the full descriptions of the composite algorithms to give better insight; in the figures, however, the short names listed in Table 1-5 are used for practical reasons. SMMR has in general the lowest standard deviation. This may be because its 18 GHz channel is further away from the water vapour absorption line near 22 GHz than the corresponding channels of the other two sensors. The averages and standard deviations for the three sensors are combined in Table 2-2 for clarity. All algorithms using near-90 GHz channels have very high standard deviations. Bootstrap F and other similar channel combinations have the lowest standard deviations.

Algorithm | Standard deviation (AMSR, SSM/I, SMMR) | Average (AMSR, SSM/I, SMMR)
Near 90GHz lin, dyn | 34.95, 36.98, n.a. | 27.99, 12.90, n.a.
Near 90GHz | 34.72, 33.46, n.a. | 31.29, 19.09, n.a.
ASI_NWF | 33.75, 31.02, n.a. | 37.69, 42.26, n.a.
P90 | 34.19, 32.83, n.a. | 26.69, 29.48, n.a.
P37 | 18.17, 16.54, 15.16 | 1.23, 1.03, 1.82
Bootstrap P | 18.19, 16.70, 15.18 | 1.25, 1.12, 1.84
P18 | 10.38, 10.60, 8.03 | 1.55, 1.85, -0.45
Bristol | 8.87, 8.74, 6.42 | 1.85, 2.15, 0.09
PR | 8.74, 8.23, 6.71 | 2.24, 1.80, 1.37
NASA Team | 6.89, 7.66, 4.71 | 1.69, 2.28, -0.81
NORSEX | 4.34, 5.54, 3.54 | 2.33, 2.86, -0.81
Bootstrap F | 4.34, 5.53, 3.51 | 2.20, 2.71, -0.75
CalVal | 4.34, 5.53, 3.51 | 2.20, 2.71, -0.75
UMass-AES | 4.34, 5.53, 3.51 | 2.20, 2.71, -0.75
P10 | 5.08, n.a., 5.08 | -4.42, n.a., -3.97
One channel (6H) | 2.24, n.a., 2.84 | -2.33, n.a., 1.18
TUD | 4.34, 5.53, n.a. | 2.20, 2.71, n.a.
(NT+BF)/2 | 5.47, 6.45, 3.92 | 1.94, 2.50, -0.78
(NT+BF+N90lin_dyn)/3 | 14.74, 16.05, n.a. | 10.62, 5.96, n.a.
(P37+N90lin_dyn)/2 | 26.00, 26.30, n.a. | 14.61, 6.97, n.a.
(P37+N90lin_dyn+BF)/3 | 18.38, 18.89, n.a. | 10.47, 5.55, n.a.
(BF+BF²·N90lin_dyn)/(1+BF²) | 4.71, 5.88, n.a. | 2.33, 2.85, n.a.
(BF+BF³·N90lin_dyn)/(1+BF³) | 4.42, 5.60, n.a. | 2.22, 2.74, n.a.
(BF+N90lin_dyn)/2 | 19.03, 20.56, n.a. | 15.09, 7.81, n.a.
(BF+BF·N90lin_dyn)/(1+BF) | 6.57, 7.35, n.a. | 3.50, 4.03, n.a.
OSISAF | 5.28, 6.07, 3.78 | 2.51, 2.96, -0.66
OSISAF-2 | 5.32, 6.13, 3.76 | 2.47, 2.90, -0.66
OSISAF-3 | 5.04, 5.63, 5.02 | 3.18, 3.76, -1.41
Nr. of points | 12830, 22091, 1457 | 12830, 22091, 1457

Table 2-1: SIC = 0%, Northern Hemisphere, summer

Algorithm | Standard deviation | Average
Near 90GHz lin, dyn | 35.97 | 20.45
Near 90GHz | 34.09 | 25.19
ASI_NWF | 32.38 | 39.98
P90 | 33.51 | 28.09
P37 | 16.62 | 1.36
Bootstrap P | 16.69 | 1.40
P18 | 9.67 | 0.98
Bristol | 8.01 | 1.36
PR | 7.89 | 1.80
NASA Team | 6.42 | 1.05
NORSEX | 4.47 | 1.46
Bootstrap F | 4.46 | 1.39
CalVal | 4.46 | 1.39
UMass-AES | 4.46 | 1.39
P10 | 5.08 | -4.20
One channel (6H) | 2.54 | -0.58
TUD | 4.94 | 2.46
(NT+BF)/2 | 5.28 | 1.22
(NT+BF+N90lin_dyn)/3 | 15.40 | 8.29
(P37+N90lin_dyn)/2 | 26.15 | 10.79
(P37+N90lin_dyn+BF)/3 | 18.64 | 8.01
(BF+BF²·N90lin_dyn)/(1+BF²) | 5.30 | 2.59
(BF+BF³·N90lin_dyn)/(1+BF³) | 5.01 | 2.48
(BF+N90lin_dyn)/2 | 19.80 | 11.45
(BF+BF·N90lin_dyn)/(1+BF) | 6.96 | 3.77
OSISAF | 5.04 | 1.60
OSISAF-2 | 5.07 | 1.57
OSISAF-3 | 5.23 | 1.84
Nr. of points | 36378 | 36378

Table 2-2: SIC = 0%, Northern Hemisphere, summer. Average over all the instruments present for given algorithm
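Table 2-2 collapses Table 2-1 by averaging each algorithm's statistics over whichever instruments it supports. That reduction can be sketched as follows, with the input values taken from Table 2-1:

```python
import numpy as np

def average_over_instruments(values):
    """Average a per-instrument statistic (AMSR, SSM/I, SMMR) over the
    instruments available for an algorithm, skipping missing entries."""
    return float(np.nanmean(values))

# Bootstrap F standard deviations from Table 2-1 (AMSR, SSM/I, SMMR):
bf = average_over_instruments([4.34, 5.53, 3.51])          # 4.46
# Near 90GHz lin dyn has no SMMR entry (no ~90 GHz channel on SMMR):
n90 = average_over_instruments([34.95, 36.98, np.nan])     # 35.97
```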

The bar charts in Figure 2-1 and Figure 2-2 show Table 2-1 graphically. The algorithms using the 19V-37V channel combinations have the lowest standard deviations for all three sensors.

Figure 2-1: Standard deviations, SIC = 0%, Northern hemisphere, summer


Figure 2-2: Bias, SIC = 0%, Northern hemisphere, summer

2.1.2

Winter (October – May)

Table 2-3 and Table 2-4 (and Figure 2-3 and Figure 2-4) show the average and standard deviation of the ice concentration over the open water sites in winter (October – May) on the northern hemisphere. The overall pattern is very similar to the summer situation. The differences reflect the different algorithms' sensitivity to wind, atmospheric humidity and other seasonally changing quantities [Andersen et al. 2006]. These different sensitivities are identified in section 8 using a simulated dataset. Some of the quantities, such as water vapour, have climatological trends, and a small difference between the summer and winter dataset is an asset for an algorithm. For example, the Bootstrap F has a low summer – winter difference (0.95 for all data points the bias cannot only be in SIC even though the bias can be adjusted using seasonally varying tie-points.

MODIS-derived quantities (avg ± stdev):

Period   SIC            SIC std        MPF            MPF std        (1-W)
All      0.949 ± 0.028  0.014 ± 0.012  0.229 ± 0.114  0.035 ± 0.020  0.732 ± 0.116
June     0.959 ± 0.020  0.010 ± 0.013  0.137 ± 0.045  0.024 ± 0.012  0.828 ± 0.056
July     0.949 ± 0.027  0.016 ± 0.010  0.307 ± 0.103  0.046 ± 0.018  0.657 ± 0.098
August   0.909 ± 0.017  0.021 ± 0.008  0.270 ± 0.080  0.037 ± 0.028  0.664 ± 0.074

AMSR brightness temperatures (avg / stdev, K):

Channel  All             June            July            August
6.9V     256.20 / 8.66   262.32 / 4.53   252.03 / 8.89   249.63 / 3.51
6.9H     234.91 / 15.74  245.21 / 9.61   228.05 / 16.16  223.19 / 7.77
10.7V    255.36 / 8.26   261.05 / 4.54   252.24 / 8.02   246.29 / 4.44
10.7H    234.06 / 14.33  243.08 / 9.04   229.08 / 14.31  219.85 / 8.13
18.7V    253.67 / 9.66   258.47 / 6.06   253.24 / 6.35   237.53 / 12.63
18.7H    232.87 / 12.74  238.46 / 9.80   232.02 / 10.64  215.42 / 13.16
23.8V    251.50 / 10.76  254.74 / 7.50   252.92 / 6.59   234.06 / 16.23
23.8H    232.79 / 12.48  236.00 / 10.39  234.16 / 9.33   215.63 / 16.02
36.5V    237.46 / 17.84  238.99 / 14.86  241.52 / 14.44  216.18 / 24.00
36.5H    219.59 / 17.51  221.54 / 15.25  222.64 / 15.18  200.70 / 21.59
89.0V    226.21 / 22.86  218.75 / 19.57  235.18 / 22.78  219.27 / 22.11
89.0H    215.88 / 21.81  208.93 / 17.95  223.98 / 22.69  210.47 / 20.88

Table 7-1: Average values and stdevs for MODIS derived SIC and MPF and for AMSR TBs

Table 7-1 shows a summary of SIC and MPF derived from MODIS data, split into the three summer months. Note that the melt pond fraction is 13% in June, increasing to 30% in July and 27% in August for the analysed areas.
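The per-month statistics tabulated here reduce to a bias and a standard deviation of the retrieved SIC against the MODIS-derived reference. A minimal sketch of that computation (function name and the sample values are illustrative, not taken from the report's processing):

```python
import numpy as np

def bias_and_std(sic_algorithm, sic_reference):
    """Bias (mean difference) and sample stdev of an algorithm's SIC
    against a reference SIC, e.g. the MODIS-derived C = 1-W."""
    diff = np.asarray(sic_algorithm, dtype=float) - np.asarray(sic_reference, dtype=float)
    return diff.mean(), diff.std(ddof=1)

# Illustrative numbers only (percent SIC)
bias, std = bias_and_std([100.0, 102.0, 98.0], [95.0, 95.0, 95.0])
```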


[Table 7-2 body: average and stdev of retrieved SIC for All data, June, July and August for each tested algorithm (Near90_lin_dyn, Near90GHz, ASI_NWF, P90, P37, Bootstrap_p, P18, Bristol, PR, NASA_Team, NORSEX, Bootstrap_f, CalVal, UMass_AES, P10, One_channel, TUD, the hybrid combinations and the OSISAF variants), the MODIS reference C = 1-W (All: 73.2 ± 11.56, June: 82.8 ± 5.62, July: 65.7 ± 9.75, August: 66.4 ± 7.35) and the NASA Team multi-year ice concentration CMY]

Table 7-2: Calculated SIC and their stdevs for the tested algorithms. Upper row in red are MODIS derived reference values. The bottom row is multi-year ice concentration as derived with the Nasa Team algorithm


8

Simulated data

8.1

The simulated sea ice concentration

Microwave brightness temperatures used for computing the ice concentration are sensitive to noise from the atmosphere and from surface emission variability. Even though the sensitivity to noise is minimized in sea ice concentration algorithms in general, the estimated ice concentration may still retain some sensitivity to noise. Over open water the dominating noise sources are wind roughening of the water surface, water vapor in the atmosphere and cloud liquid water. Over ice the atmosphere plays a minor role, except at near 90 GHz where liquid water is a noise source. Over ice the noise is dominated by snow and ice temperature variability, snow depth and grain size variability, and to some extent snow surface density variability as a proxy for layering in the snow. Snow layering and surface roughness effects can be investigated on a case by case basis, but this is beyond the scope here. The parameters mentioned above are investigated in this chapter. In these simulations the real ice concentration is 1 over ice and 0 over open water; all deviations from 1 and 0 in the estimated ice concentrations are caused by sensitivity to noise. Traditionally, weather filters have been applied to suppress noise over open water. A weather filter is an ice - water classifier which truncates pixels classified as open water to 0% ice concentration. This removes noise in open water regions, but it may also remove real low-concentration ice or new ice along the ice edge, and it does not work over ice. Some processing facilities, e.g. EUMETSAT's OSI SAF, use explicit correction of the brightness temperatures before computing the ice concentration. This is a spatially and temporally varying noise reduction which works over both ice and open water. The correction uses NWP data of wind, temperature and water vapor together with an atmospheric radiative transfer model to correct the brightness temperatures.
This procedure requires dynamical tie-points to avoid potential biases from the model. Even though there are very good radiative transfer models for the atmosphere it is not possible to correct for all noise sources. For example, the representation of cloud liquid water in NWP models is not suitable for correction. The parameters in the snow and ice are difficult to measure or quantify in numerical models and they have not been used for explicit correction. It is therefore important to find algorithms with low sensitivity to physical parameters which are difficult to correct for. This includes cloud liquid water and snow and ice parameters in general.
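A weather filter of the kind described above can be sketched as a simple spectral-gradient threshold. The channel pair and the 0.05 threshold below are illustrative values in the spirit of the classic GR(37v,19v) filter, not the exact filter of any tested product; the tie-point values in the comments come from the appendix tables:

```python
def gradient_ratio(tb37v, tb19v):
    """Spectral gradient ratio GR(37v,19v) from brightness temperatures (K)."""
    return (tb37v - tb19v) / (tb37v + tb19v)

def weather_filter(sic, tb37v, tb19v, threshold=0.05):
    """Truncate retrieved SIC to 0 where GR indicates open water.

    Over open water the 37v-19v gradient is large and positive
    (e.g. NH AMSR water tie-points 209.81 / 183.72 K give GR = 0.066),
    while over consolidated ice it is near zero or negative, so the
    filter removes weather noise over water but has no effect over ice.
    The 0.05 threshold is an illustrative value.
    """
    if gradient_ratio(tb37v, tb19v) > threshold:
        return 0.0
    return sic
```

Note how this also illustrates the limitation discussed in the text: real low-concentration ice with a water-like gradient would be truncated to 0 as well.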

8.2

The simulated data

Combined thermodynamic and emissivity models have the potential to build long snow/sea ice/microwave time series that can be used for statistical analysis of radiometer sea ice data sensitivities (Mätzler et al., 2006). However, Wiesmann et al. (2000) show that one-dimensional thermodynamic models for snow and frozen ground including microphysical parameters and a vertical stack of layers, for example SNTHERM (Jordan, 1991) and Crocus (Brun et al., 1989), underestimate the formation of thin crusts or weak layers in the snow pack. Comparison to snow pit measurements showed that the density of thin layers is underestimated in Crocus and that thin layers are not represented properly in SNTHERM. Further, when the thermodynamic model output is used as input to a microwave emission model, this leads to underestimation of the simulated polarization difference. These models were developed for other applications such as avalanche risk assessment. The thermodynamic model used here treats layers from individual precipitation events and retains all layers even when thin (1 mm), in an attempt to alleviate earlier problems in microwave modeling applications (Tonboe, 2005). Representing the layering in the snow is very important for simulating realistic Tbs, and in particular the Tv and Th polarization difference (Wiesmann et al., 2000). To ensure a reasonable initial snow layer thickness, precipitation events of less than 1 kg/m2 over 6 h are retained and released only when the next precipitation event exceeds the threshold. While this may not be totally realistic, it produces simulated snow depths which are comparable to climatology (Warren et al., 1999). An earlier investigation of different sea ice concentration algorithms by Tonboe and Andersen (2004) used measured profiles as input to the emission model, changing the density and grain size of specific layers in the snow pack. It showed that the frequency algorithms had low sensitivity to these changes, while the algorithms using near 90 GHz channels had a high sensitivity to the density of the snow surface.

8.3

The microwave emission models

Sea ice emission models relate physical snow and ice properties such as density, temperature, and snow crystal and brine inclusion size to microwave attenuation, scattering and reflectivity. The model used here is a sea ice version of MEMLS (Wiesmann and Mätzler, 1999) described in Mätzler et al. (2006), hereafter called the emission model. The theoretical improved Born approximation was used here; it is valid for a wider range of frequencies and scatterer sizes than the empirical formulations (Mätzler and Wiesmann, 1999). MEMLS, using an empirical scattering formulation, has earlier been validated for snow on land in the 5–100 GHz region (Wiesmann and Mätzler, 1999). The concern at higher frequencies is the validity of this empirical scattering formulation. The theoretical improved Born approximation is in principle also valid in the 100–200 GHz range and for large scatterers (Mätzler and Wiesmann, 1999). When using the improved Born approximation, the shape of the scatterers is important for the scattering magnitude (Mätzler, 1998). We assume spherical scatterers in snow when the correlation length pec, a measure of grain size, is less than 0.2 mm, and cup-shaped scatterers resembling depth hoar crystals when it is greater than 0.2 mm. The sea ice version of MEMLS includes models for the sea ice dielectric properties while using the same principles for radiative transfer as the snow model. Again, the scattering within sea ice layers beneath the snow is estimated using the improved Born approximation. The scattering in first-year ice and multi-year ice is assumed to come from small brine pockets and air bubbles within the ice, respectively. All simulations are at 50° incidence angle, similar to AMSR, SSMIS and other conically scanning radiometers. In addition to the surface emission model for sea ice, we use a "Wentz" model for simulating the atmospheric emission and absorption and the open water emissivity. The "Wentz" model is an open water model specifically developed for the different radiometers: SMMR (Wentz, 1983), SSM/I excluding the 85 GHz channels (Wentz, 1997), and AMSR (Wentz and Meissner, 2000).


Each of these models has been modified so that it can be used over ice by taking the sea ice emissivity and effective temperature as input.

8.4

The snow and sea ice thermodynamic and mass model

In order to produce input to the emissivity model, a one-dimensional snow/ice thermodynamic model has been developed. Its purpose is not necessarily to reproduce a particular situation in time and space, but rather to provide realistic microphysical input to the emissivity model. Earlier thermodynamic models such as Maykut and Untersteiner (1971), or even simple degree-day models (see e.g. Maykut, 1986), are useful for simulating ice thickness. However, for microwave emission modeling applications, additional parameters such as temperature, density, snow grain size and ice salinity are needed at very high vertical resolution. Because the thermal conductivity is a function of temperature, the model uses an iterative procedure between each time step of 6 h. The thermodynamic model is fed with ECMWF ERA40 data at these 6 h intervals. In return, the thermodynamic model produces detailed snow and ice profiles which are input to the emission model at each time step. A "Wentz" model is used for simulating the atmospheric emission and absorption. This gives a picture of the significant emission processes in sea ice, even though the one-dimensional thermodynamic model does not capture the spatial variability of the sea ice cover caused, for example, by ice convergence resulting in deformation, ice divergence resulting in new-ice formation, and wind redistribution of the snow cover affecting snow depth, density and grain size. The thermodynamic model has the following prognostic parameters for each layer: thermometric temperature, density, thickness, snow grain size and type, ice salinity and snow liquid water content. Snow layering is very important for the microwave signatures; therefore the model treats snow layers related to individual snow precipitation events. For sea ice it has a growth-rate-dependent salinity profile: the sea ice salinity is a function of growth rate and water salinity (Nakawo and Sinha, 1981).
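The iterative procedure can be illustrated with a toy one-dimensional diffusion step in which the conductivity is recomputed from the evolving temperatures. The conductivity parameterization, heat capacity and layer spacing below are placeholders for illustration, not the report's actual model:

```python
import numpy as np

def snow_conductivity(temp_k, density):
    """Placeholder k(T, rho) in W/m/K: grows with density, weakly with T."""
    return 0.024 + 2.5e-6 * density**2 * (temp_k / 273.15)

def step_profile(temp, density, dz, t_air, t_ocean, dt=6 * 3600.0, n_sub=20):
    """Advance a layered temperature profile by one 6 h forcing interval.

    Because k depends on temperature, it is recomputed every substep
    (the iteration mentioned in the text). Explicit scheme for clarity;
    boundary conditions are the air temperature at the snow surface and
    the freezing point at the ice base.
    """
    temp = np.asarray(temp, dtype=float).copy()
    sub = dt / n_sub
    for _ in range(n_sub):
        k = snow_conductivity(temp, density)
        new = temp.copy()
        new[0], new[-1] = t_air, t_ocean
        for i in range(1, len(temp) - 1):
            flux = (k[i] * (temp[i + 1] - temp[i])
                    - k[i - 1] * (temp[i] - temp[i - 1])) / dz**2
            new[i] = temp[i] + sub * flux / (density[i] * 2100.0)  # c_p ~ 2100 J/kg/K
        temp = new
    return temp
```

When the surface forcing cools, the profile relaxes towards the new boundary values while staying bounded by them, as expected of a stable diffusion solve.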

8.5

Simulation procedure

The input to the thermodynamic model at 6 hour intervals is ECMWF ERA40 re-analysis climate data: air pressure, air temperature, wind speed, humidity, precipitation, incoming short-wave radiation and incoming long-wave radiation. These data are used in the model for computing the surface energy balance and snow accumulation. In addition to the parameters used by the thermodynamic model, the cloud liquid water and water vapor are also extracted from the ERA40 data; these are used as input to the "Wentz" model. The output from the thermodynamic model at 6 hourly time steps is a snow and sea ice profile including, for each layer, the temperature, the density, the correlation length, the salinity and the snow or ice type, which is input to the snow and ice emission model. The emission model computes the emissivity and the effective temperature, which are used together with the cloud liquid water and water vapor as input to the "Wentz" atmospheric model. Over open water the "Wentz" model is used alone. The output is the top-of-atmosphere brightness temperatures, which can be used as input to the ice concentration algorithms, together with all other physical parameters describing the system.
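The chain above - ERA40 forcing into the thermodynamic model, profiles into the emission model, and emissivity plus effective temperature into the "Wentz" atmosphere - can be sketched structurally as below. All three model functions are toy stand-ins with made-up coefficients, shown only to make the data flow explicit:

```python
def thermodynamic_model(forcing):
    """Toy stand-in: one 6 h forcing record -> snow/ice profile summary."""
    return {"t_eff": forcing["t_air"] + 5.0}   # placeholder effective temperature

def emission_model(profile, freq_ghz):
    """Toy stand-in for the MEMLS-type surface model: (emissivity, Teff)."""
    return 0.95 - 0.001 * freq_ghz, profile["t_eff"]   # placeholder spectral slope

def wentz_atmosphere(emissivity, t_eff, vapor, liquid):
    """Toy stand-in for the atmospheric model: TOA brightness temperature."""
    tau = 1.0 - 0.002 * (vapor + 10.0 * liquid)   # placeholder transmittance
    t_atm = 250.0                                 # placeholder atmospheric temperature
    return tau * emissivity * t_eff + (1.0 - tau) * t_atm

def simulate_tb(forcing, freq_ghz):
    """One 6 h step: forcing -> profile -> emissivity -> TOA Tb (K)."""
    profile = thermodynamic_model(forcing)
    e, t_eff = emission_model(profile, freq_ghz)
    return wentz_atmosphere(e, t_eff, forcing["vapor"], forcing["liquid"])

tb18v = simulate_tb({"t_air": 250.0, "vapor": 5.0, "liquid": 0.05}, 18.7)
```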

8.6

The simulated data (Antarctic cases)


Two sites were selected in Antarctica: one grid point in the Ross Sea (75°S, 200°E) on a first-year ice floe and one in the Bellingshausen Sea (64°S, 280°E) in open water. Two sites were selected in the Arctic: one in the Lincoln Sea (85°N, 240°E) in 100% multiyear sea ice and one in the Norwegian Sea (70°N, 0°E) in open water.

Even though we have identified and constructed about 20 different sea ice concentration algorithms, it is clear that there is a limited number of algorithm families within which the sensitivity to noise is very similar. There is the gradient family (e.g. CF and CalVal), the polarization family (e.g. NT and CP), the high frequency family (e.g. ASI_NWF and N90LIN), the single channel family (e.g. OneChannel - Tb19h, OneChannel - Tb6h) and the hybrid family combining gradient and polarization (e.g. Bristol and TUD). The single channel algorithms are not real candidates in the algorithm selection, but OneChannel - Tb6h is included because of its low sensitivity to the atmosphere and to surface emissivity variability. Radiometers measuring at 6 GHz were on SMMR (1978-1987) and on AMSR (2001-2011), and are now on AMSR-2 (2012 onwards). The ESMR radiometer on NIMBUS 5 covered the important period before the modern multi-frequency radiometers, from 1972 to 1977; it measured at a single channel, Tb19h. The Tb19h channel is included on all multi-frequency radiometers from 1978 until today. We have therefore selected the 9 algorithms representing different families in Table 8-1.
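The families differ in which channel combination they invert. A minimal sketch of the three basic building blocks, using the NH AMSR tie-points from the appendix tables as illustrative inputs (the helper names are ours, not from any of the tested implementations):

```python
def polarization_ratio(tb_v, tb_h):
    """PR: the quantity behind the polarization family (e.g. NASA Team at 19 GHz)."""
    return (tb_v - tb_h) / (tb_v + tb_h)

def gradient_ratio(tb_hi, tb_lo):
    """GR: the quantity behind the gradient family (e.g. Bootstrap F, 37v/19v)."""
    return (tb_hi - tb_lo) / (tb_hi + tb_lo)

def one_channel_sic(tb, tb_water, tb_ice):
    """Single-channel family: linear interpolation between two tie-points."""
    return (tb - tb_water) / (tb_ice - tb_water)

# NH AMSR tie-points (appendix tables): open water vs first-year ice at 18.7 GHz
pr_water = polarization_ratio(183.72, 108.46)   # large PR over water
pr_ice = polarization_ratio(252.15, 237.54)     # small PR over ice
sic = one_channel_sic(200.0, tb_water=108.46, tb_ice=237.54)  # ESMR-like Tb19h
```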

Algorithm            Category
NASA                 Primarily 19 GHz polarization
Bristol              19 and 37 GHz hybrid
Bootstrap F          19 and 37 GHz gradient
Bootstrap P          37 GHz polarization
TUD                  19, 37 and near 90 GHz hybrid
ASI_NWF              Near 90 GHz high frequency
N90LIN               Near 90 GHz high frequency
OneChannel - Tb6h    Low frequency single channel
OneChannel - Tb19h   ESMR single channel

Table 8-1: Categorization of the 9 selected algorithms. The polarization algorithms use the polarization difference or ratio. The gradient algorithms use the spectral gradient, e.g. between Tb19v and Tb37v. "Hybrid" refers to a combination of polarization and gradient. Here low frequency is 6 GHz and high frequency is near 90 GHz. ESMR is the single channel (Tb19h) radiometer on NIMBUS 5.

The surface parameters in snow and sea ice affecting the thermal microwave emission and the estimated ice concentration are a stretch target for NWP and sea ice models at present. The parameters are even difficult to measure in the field and to simulate using detailed process models. The complexity of the atmosphere - snow - ice system makes it difficult to identify and define parameters, and the parameters are often not independent, as seen in the correlation matrix, e.g. the average snow correlation length, the snow depth and the snow temperature gradient. Other parameters such as the atmospheric water vapor and the snow surface temperature are also correlated.


[Table 8-2 body: correlation coefficients (diagonal 1.0) among the eight snow/ice/atmosphere parameters and the SIC retrieved with the NASA Team, Bristol, Bootstrap F, Bootstrap P, TUD, ASI and N90L algorithms]

Table 8-2: The Ross Sea correlation matrix. The snow surface density: Dens, The average snow correlation length: appc, The snow surface temperature: Ti, The snow ice interface temperature: ist, The snow depth: St, the snow temperature gradient: snowg, The atmospheric water vapor: Vapor, The cloud liquid water: Liquid.
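A matrix like Table 8-2 is simply the Pearson correlation of the time series of each parameter and each retrieved SIC. A sketch with synthetic placeholder series (the coupling between snow depth and correlation length is invented for illustration, echoing the dependence noted in the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
snow_depth = rng.normal(0.3, 0.1, n)                            # St (m), placeholder
corr_length = 0.1 + 0.5 * snow_depth + rng.normal(0, 0.05, n)   # appc, coupled to St
vapor = rng.normal(5.0, 2.0, n)                                 # independent of St

# Columns = parameters; rowvar=False correlates columns against each other
corr = np.corrcoef(np.column_stack([snow_depth, corr_length, vapor]), rowvar=False)
```

The coupled pair shows a high off-diagonal coefficient while the independent parameter stays near zero, which is exactly why the parameters in the table cannot be interpreted in isolation.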


Figure 8-1: The simulated snow and ice profile in the Ross Sea (75°S, 200°E)

The ice in the simulated profile from the Ross Sea shown in Figure 8-1 grows from the initial 2 cm at the beginning of the season to about 150 cm at the end of the cold season. The snow cover gradually accumulates during several snow precipitation events to a thickness of about 40-50 cm.


Figure 8-2: The simulated Ross Sea ice concentration

After the initial ice growth and snow accumulation phase, the simulated sea ice concentration from the different algorithms gives near-100% estimates, as shown in Figure 8-2. It is unclear whether the initial phase is a realistic representation of thin ice or a shortcoming of the thermodynamic model in providing a realistic snow profile to the emission model. Therefore the initial 120 iterations (30 days) have been excluded from the analysis in all ice profiles, and we only include mature ice where the simulated ice concentration is near 100%.


Figure 8-3: The 9 different ice concentration algorithms' sensitivity to the snow - ice interface temperature in the Ross Sea profile (see Figure 8-1).

Figure 8-3 shows the 9 different algorithms' sensitivity to the snow - ice interface temperature. For cold temperatures (3.0) 3. Remove points from summer (months 5-9 incl. for N, and months 11-3 incl. for S)

Northern Hemisphere

AMSR (water from Tiepoints_20130117.xlsx, e-mail from Leif Jan 17, 2013 at 14:41; ice: hard copy 8 Apr 2013: 2007-2011. MY: 15% lowest 37H, months 1-4+10-12. FY: 15% highest 37H, months 1-4+10-12)
SSMI (water: from TP20130404.xlsx, e-mail from Leif Apr 7, 2013 at 9:21; ice: hard copy 8 Apr 2013: 2007+2008)
SMMR (water from PVASR (Leif); first version in Tiepoints_20130117.xlsx, e-mail from Leif Jan 17, 2013 at 14:41; ice same as AMSR)

Southern Hemisphere

AMSR (water from Tiepoints_latest_S.xlsx, e-mail from Leif Jan 31, 2013 at 21:51; ice: hard copy 8 Apr 2013: 2007-2011. MY: 15% lowest 37H, months 5-11. FY: 15% highest 37H, months 5-11)
SSMI (water: from TP20130404.xlsx, e-mail from Leif Apr 7, 2013 at 9:21; ice: hard copy 8 Apr 2013: 2007+2008. MY: 15% lowest 37H, months 5-11 only. FY: 15% highest 37H, months 5-11 only)
SMMR (water from PVASR (Leif); first version in Tiepoints_20130117.xlsx, e-mail from Leif Jan 17, 2013 at 14:41; ice same as AMSR)
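The "15% lowest / 15% highest 37H" recipe above can be sketched as a percentile selection over samples of near-100% ice; function and variable names are ours:

```python
import numpy as np

def ice_tiepoints_from_37h(tb37h, tb_all, frac=0.15):
    """Derive MY and FY ice tie-points from samples of ~100% ice.

    Following the recipe in the text: sort the samples by their 37H
    brightness temperature, take the lowest `frac` as multi-year ice
    and the highest `frac` as first-year ice, and use the per-channel
    means of each subset as tie-points.
    """
    order = np.argsort(tb37h)
    k = max(1, int(frac * len(tb37h)))
    my_tp = tb_all[order[:k]].mean(axis=0)    # lowest 37H -> multi-year
    fy_tp = tb_all[order[-k:]].mean(axis=0)   # highest 37H -> first-year
    return my_tp, fy_tp
```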


Open water

NH
Channel  w06h   w06v   w10h   w10v   w18h   w18v   w22h   w22v   w37h   w37v   w85h   w85v
AMSR     82.13  161.35 88.26  167.34 108.46 183.72 128.23 196.41 145.29 209.81 196.94 243.20
AMSR NX: 170.01, 193.19
SSMI     0      0      0      0      117.16 185.04 0      200.19 149.39 208.72 205.73 243.67
SSMI NX: 171.56, 191.87
SMMR     86.49  153.79 95.59  161.81 111.45 176.99 135.98 185.93 147.67 207.48 0      0
SMMR NX: 162.61, 190.80

Table 11-1: Tie-points for Northern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm

SH
Channel  w06h   w06v   w10h   w10v   w18h   w18v   w22h   w22v   w37h   w37v   w85h   w85v
AMSR     80.15  159.69 86.62  166.31 110.83 185.34 137.19 201.53 149.07 212.57 207.20 247.59
AMSR NX: 171.86, 196.65
SSMI     0.00   0.00   0.00   0.00   118.00 185.02 0.00   198.66 152.24 209.59 206.12 242.41
SSMI NX: 171.52, 192.94
SMMR     83.47  148.60 93.80  159.12 110.67 175.39 129.63 186.10 149.60 207.57 0.00   0.00
SMMR NX: 160.77, 190.92

Table 11-2: Tie-points for Southern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm


First year Ice

NH
Channel  fy06h  fy06v  fy10h  fy10v  fy18h  fy18v  fy22h  fy22v  fy37h  fy37v  fy85h  fy85v
AMSR     232.08 251.99 234.01 251.34 237.54 252.15 236.72 250.87 235.01 247.13 222.39 232.01
AMSR NX: 251.17, 244.47
SSMI     -      -      -      -      238.20 252.79 -      250.46 233.25 244.68 217.21 225.54
SSMI NX: 251.91, 241.53
SMMR     232.08 251.99 234.01 251.34 237.54 252.15 236.72 250.87 235.01 247.13 -      -
SMMR NX: 251.17, 244.47

Table 11-3: Tie-points for Northern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice

SH
Channel  fy06h  fy06v  fy10h  fy10v  fy18h  fy18v  fy22h  fy22v  fy37h  fy37v  fy85h  fy85v
AMSR     236.52 257.04 238.50 257.23 242.80 258.58 242.61 257.56 239.96 253.84 232.40 242.81
AMSR NX: 258.41, 252.57
SSMI     -      -      -      -      244.57 259.92 -      257.85 241.63 254.39 235.76 244.84
SSMI NX: 259.93, 253.25
SMMR     236.52 257.04 238.50 257.23 242.80 258.58 242.61 257.56 239.96 253.84 -      -
SMMR NX: 258.41, 252.57

Table 11-4: Tie-points for Southern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice

11.2.2

Multi-year ice

NH
Channel  my06h  my06v  my10h  my10v  my18h  my18v  my22h  my22v  my37h  my37v  my85h  my85v
AMSR     221.19 246.04 216.31 239.61 207.78 226.26 199.60 216.67 184.94 196.91 178.90 187.60
AMSR NX: 222.11, 184.02
SSMI     -      -      -      -      206.46 223.64 -      216.72 179.68 190.14 173.59 180.55
SSMI NX: 219.20, 175.93
SMMR     221.19 246.04 216.31 239.61 207.78 226.26 199.60 216.67 184.94 196.91 -      -
SMMR NX: 222.11, 184.02

Table 11-5: Tie-points for Northern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice

SH
Channel  my06h  my06v  my10h  my10v  my18h  my18v  my22h  my22v  my37h  my37v  my85h  my85v
AMSR     225.37 254.18 221.47 251.65 217.65 246.10 213.79 240.65 204.66 226.51 197.78 210.22
AMSR NX: 244.39, 219.62
SSMI     -      -      -      -      221.95 246.27 -      242.01 207.57 226.46 200.88 211.98
SSMI NX: 244.59, 219.59
SMMR     225.37 254.18 221.47 251.65 217.65 246.10 213.79 240.65 204.66 226.51 -      -
SMMR NX: 244.39, 219.62

Table 11-6: Tie-points for Southern Hemisphere used with non-atmospheric corrected TBs, NX columns are tie-points for the NORSEX algorithm. SMMR tie-points for FY and MY ice are set to AMSR tie-points since we do not have RRDP data for SMMR from 100% ice

11.3

Tie-points for atmospheric correction tests

Analysis of atmospherically corrected data (2008) has been carried out with the same algorithms as for the non-corrected data. When possible, we have applied tie-points derived from the atmospherically corrected data. However, this is not the case for a number of fixed-equation algorithms, and these subsequently show biases in SIC. Only the water tie-points have been changed for these tests; the FY and MY tie-points remain the same as before correction.

Tie-points for SIC0 after ATM correction of TBs

AMSR channel  avg     stdev
6.9V          156.27  1.95
6.9H          72.58   2.59
10.7V         163.12  1.99
10.7H         77.93   3.55
18.7V         179.65  3.27
18.7H         92.34   6.72
23.8V         191.14  4.61
23.8H         103.93  8.89
36.5V         209.51  5.06
36.5H         129.98  11.14
89.0V         244.34  5.05
89.0H         175.53  13.52

SSMI channel  avg     stdev
tb19v         176.22  3.20
tb19h         97.13   6.19
tb22v         181.25  4.49
tb37v         205.4   4.19
tb37h         133.6   9.12
tb85v         239.11  4.78
tb85h         174.62  11.33

Table 11-7: Tie-points SIC=0 used for testing algorithm performance with atmospheric corrected TBs. Same TPs are used for N and S.

11.4

Near 90 GHz algorithms

Four near 90 GHz algorithms have been tested. Three are nonlinear relationships between polarisation difference and SIC, and the fourth is a linear relationship (see Figure 11-2). All the nonlinear algorithms saturate around 0% and 100% ice and thus tend to produce small standard deviations in the tests presented in this report. The algorithms in their original implementations have cut-offs beyond the locations of maximum and minimum SIC, and in addition some of them also use a weather filter in the published version. We tested weather filters separately and have thus decided to present results for the algorithms without cut-offs and weather filters.


Figure 11-2: Relationship between P90 (n90V-n90H) and Sea Ice Concentration

The figure above shows the relationship between the near 90 GHz polarization difference (V-H) and SIC for the N90 algorithms. The figure can be used to help understand some of the results for these algorithms. In particular, the figure illustrates why there is an upper and a lower limit to the SIC derived using the 3 nonlinear versions. We did not set SIC to 0/1 when P90 went beyond the saturation level (the P90 corresponding to dSIC/dP90 = 0). This means that, e.g., a P90 smaller than ca. 8 will result in SIC becoming smaller than 1.
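The turnaround just described follows directly from the cubic construction of the nonlinear algorithms: a third-order polynomial forced through the two tie-points with zero slope at both ends necessarily has its extrema there, and turns over beyond them. The sketch below solves for such a cubic; the tie-point values of 10 K and 47 K are hypothetical, not those of ASI or the other tested versions:

```python
import numpy as np

p_ice, p_water = 10.0, 47.0   # hypothetical P90 tie-points (K)

# Constraints: SIC(p_ice)=1, SIC(p_water)=0, dSIC/dP90 = 0 at both ends
A = np.array([
    [p_ice**3,       p_ice**2,    p_ice,   1.0],
    [p_water**3,     p_water**2,  p_water, 1.0],
    [3 * p_ice**2,   2 * p_ice,   1.0,     0.0],
    [3 * p_water**2, 2 * p_water, 1.0,     0.0],
])
d = np.linalg.solve(A, np.array([1.0, 0.0, 0.0, 0.0]))

def sic_n90(p90):
    """Nonlinear near-90 GHz SIC(P90); without cut-offs the cubic turns
    over beyond its extrema, so SIC falls below 1 again for very small P90."""
    return np.polyval(d, p90)
```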

Figure 11-3: Relationship between P85 (85V-85H) and Sea Ice Concentration near SIC0 for the 4 near 90 GHz algorithms in the form they were tested.


Figure 11-4: Scatterplot of P90 before atmospheric correction (x-axis) vs P90 after correction (y-axis).

Figure 11-4 shows a scatterplot of P90 before and after atmospheric correction. It is clear that there is a substantial bias between these (as would be expected) but also that the scatter is substantially reduced by the atmospheric correction.

12

Instrument drift

In order to establish whether there is any instrument drift, we have plotted average winter brightness temperatures and sea ice concentrations obtained from selected algorithms, covering all the years that are used in the RRDP.
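Drift assessment reduces to fitting a trend through the yearly winter means plotted in the figures below; a minimal sketch with a hypothetical drifting record:

```python
import numpy as np

def winter_trend(years, winter_means):
    """Least-squares slope and intercept of yearly winter means.

    A slope significantly different from zero over a single sensor's
    record suggests instrument drift (or a trend in the reference
    samples themselves), which dynamic tie-points can absorb.
    """
    return np.polyfit(years, winter_means, 1)

# Hypothetical SSM/I-era record drifting by +0.05 K per year around 185 K
years = np.arange(1988, 1998)
tb_winter = 185.0 + 0.05 * (years - 1988)
slope, intercept = winter_trend(years, tb_winter)
```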

12.1

Northern Hemisphere

12.1.1

SIC = 0%

Figure 12-1: SIC = 0%, Northern Hemisphere, average winter sea ice concentrations from selected algorithms. SMMR, SSM/I, AMSR


Figure 12-2: SIC = 0%, Northern Hemisphere, average winter brightness temperatures for selected channels. SMMR, SSM/I, AMSR

12.1.2

SIC = 100%

Figure 12-3: SIC = 100%, Northern Hemisphere, average winter sea ice concentrations from selected algorithms. SSM/I, AMSR


Figure 12-4: SIC = 100%, Northern Hemisphere, average winter brightness temperatures for selected channels. SSM/I, AMSR


[Figure 12-5: six panels (Near90_lin_dyn, Bootstrap_p, Bristol, NASA_Team, Bootstrap_f, osisaf2), each plotting SIC (50-120%) against sample number (0-6000)]

Figure 12-5: SIC from the 2008 SIC1 SSM/I dataset before atmospheric correction. The x-axis is sample number; most points are in winter-spring. July 1 corresponds to sample 4592 and October 1 to sample 4896.

The time-series in Figure 12-5 show the 100% ice-covered round-robin data for the Northern Hemisphere for six different algorithms during winter, spring and melt-onset, without atmospheric correction. The melt-onset is clearly visible in all time-series as an abrupt but artificial depression in ice concentration. Bristol and Bootstrap-F show an artificial increase in ice concentration towards the melt-onset, which may be caused by accelerated snow metamorphosis and the temperature increase. Both of these artefacts, the melt-onset depression and the spring ramp, can be alleviated using dynamical or seasonally varying tie-points. We know that the real ice concentration in all of these data points is 100%, so the variability around 100% indicates each algorithm's sensitivity to noise. For most algorithms the range of variability is 100% +/- 10%.
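Because the true concentration of every sample is 100%, the noise sensitivity of each algorithm can be summarised as a bias and a standard deviation of its retrievals against that reference. A minimal sketch, with invented sample values:

```python
import numpy as np

def sic_noise_stats(sic_retrieved, sic_true=100.0):
    """Bias (mean minus reference) and sample standard deviation of
    retrieved SIC against a known reference, here 100% ice."""
    sic = np.asarray(sic_retrieved, dtype=float)
    bias = sic.mean() - sic_true
    std = sic.std(ddof=1)  # ddof=1: sample (unbiased) standard deviation
    return bias, std

# Illustrative retrievals (%) for one algorithm over known 100% ice:
sic = [98.0, 102.0, 100.0, 96.0, 104.0, 100.0]
bias, std = sic_noise_stats(sic)
```

Comparing these two numbers across algorithms, on identical input samples, is what separates tie-point-correctable systematic offsets (bias) from genuine noise sensitivity (standard deviation).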

13 Summary and conclusions

An ultimate sea ice concentration algorithm for climate time series has low sensitivity to noise, in particular to the noise sources that we cannot correct for and to those that may have climatic trends. When correcting for noise, it is important to be able to adjust the offsets or tie-points so as not to introduce artificial trends from the auxiliary data sources, e.g. NWP data. The selected algorithm should therefore not be overly sensitive to the choice of tie-points, and it should be possible to adjust the tie-points dynamically to compensate for climatic changes in the radiometric signature of ice and water and for instrument drift. The algorithm should be accurate at low, intermediate and high concentrations. Along the ice edge, spatial resolution and sensitivity to new ice and atmospheric effects are of particular concern. It is also important that the algorithm uses a selection of channels for which the processing of long time-series is possible, i.e. 19 and 37 GHz. Finally, it is important to be able to quantify the uncertainties. These criteria differ to some extent from the criteria for ice information used for planning of navigation etc., where high spatial resolution is the single most important criterion. The comprehensive algorithm intercomparison study reported in the previous chapters leads to a number of partial conclusions:

- N90 SIC algorithms are very sensitive to atmospheric noise at low SIC.
- Most algorithms improve substantially when TBs are corrected for atmospheric influence.
- The Bootstrap-F algorithm is the best of the simple algorithms at low SIC.
- The Bristol algorithm is among the best at high SIC.
- An OSISAF-like combination of Bootstrap-F and Bristol is a good choice for an overall algorithm.
- After atmospheric correction of TBs, SIC over ice will be more uncertain (more noisy) than SIC over open water.
- Thin ice is seen as reduced concentration by all algorithms.
- Melt ponds are seen as water by all algorithms; the differences in the (1-W) vs. SIC slope between algorithms are not yet fully understood.
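The atmospheric correction of TBs referred to in these conclusions can be pictured as a first-order subtraction of the modelled TB response to NWP-parameter deviations from reference conditions. The sketch below is schematic only: the sensitivity and reference values are invented placeholders, not values from this study or from a radiative transfer model.

```python
# Hypothetical linearized sensitivities of one channel's TB
# (K per unit of each NWP parameter) and reference conditions.
# Real values would come from a radiative transfer model.
SENS = {"wind": 0.5, "water_vapour": 1.2, "surface_temp": 0.3}
REF = {"wind": 8.0, "water_vapour": 10.0, "surface_temp": 275.0}

def correct_tb(tb, nwp):
    """First-order atmospheric correction: subtract the modelled TB
    response to NWP-parameter deviations from reference conditions.
    Cloud liquid water is deliberately excluded, as recommended in
    this report."""
    return tb - sum(SENS[k] * (nwp[k] - REF[k]) for k in SENS)

tb_corr = correct_tb(195.0, {"wind": 10.0, "water_vapour": 12.0,
                             "surface_temp": 270.0})
```

The retrieval algorithm is then applied to the corrected TBs, which (as the bullet list notes) reduces noise over open water while making SIC over ice somewhat noisier.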

Some algorithms show a pronounced seasonal cycle for SIC = 100% due to accelerated snow metamorphosis: rising temperatures lead to wet snow, which increases TB, and subsequently to melt ponds, which decrease TB. This is true for all algorithms, but in particular for Bristol and Bootstrap-F. Systematic biases in ice concentration can be adjusted using tie-points, and the seasonal variability can be alleviated using dynamical tie-points. However, dynamic tie-points require averaging over a large number of data points (to reduce noise) and thus over some period of time, whereas the results here show that some of the summer variability occurs on quite short time scales (days to weeks).

It is clear from these partial conclusions that no single algorithm is superior by all criteria, and a combination of algorithms such as the OSISAF algorithm seems a good choice. Even though the near-90 GHz algorithms with their high resolution have good skill over high-concentration ice, near-90 GHz data are not available before 1991, and the N90 algorithms are very sensitive to noise over open water and near the ice edge. Furthermore, their skill over ice is comparable to e.g. the Bristol algorithm. It is surprising to note that some near-90 GHz algorithms seem less sensitive to the fraction of melt ponds; however, this is attributed to overestimation and nonlinearities in the algorithms rather than to seeing melt ponds as ice. Atmospheric correction of the near-90 GHz channels substantially reduces noise in the ice concentration estimate, and it is possible to use dynamical tie-points with these algorithms. Given their high spatial resolution, the near-90 GHz algorithms may therefore be a good choice for planning of navigation in ice-infested waters, but they seem too noisy for climate applications. Further, the ASI_NWF and NASA Team 2 algorithms have clear biases at intermediate concentrations.

Among the algorithms using the 19 and 37 GHz channels, Bristol is the best over high-concentration ice and Bootstrap-F is best over open water. The combination of the two, i.e. the OSISAF algorithm, is concluded to be the best overall combination, although other combinations come close and could be used as well. This (not surprisingly) is the same conclusion that led to the use of the OSISAF algorithm [Andersen et al. 2007] by EUMETSAT. The advantage of using a set of 19 and 37 GHz algorithms is that the data extend from the fall of 1978 until today and into the future. Using a combination of Bristol over high concentrations and Bootstrap-F over low concentrations has the following implications:

- Over ice, Bristol is sensitive to the snow surface and snow-ice interface temperature. Among the ice parameters, temperature is the only one quantified in NWP models, and it may be transformed into the effective temperature at different frequencies, so there is a possibility for correction.
- Bristol has an apparently small sensitivity to cloud liquid water; however, this apparent sensitivity is probably due to a correlation between air temperature and cloud liquid water.
- Bristol performs well over melting ice: it has an ice concentration vs. (1 - melt-pond and lead fraction) slope near 1, as expected.
- Bristol shows a clear seasonal cycle in the ice concentration with static tie-points, which means that dynamical tie-points are needed when using Bristol.
- Since Bristol uses the 19 and 37 GHz channels, it has lower resolution than the near-90 GHz algorithms.
- Over open water, Bootstrap-F is among the algorithms with the lowest overall sensitivity to noise, including surface temperature, wind and atmospheric water vapour. In particular, Bootstrap-F is nearly insensitive to cloud liquid water, which we cannot correct for.
- The response of Bootstrap-F to atmospheric correction is a clear reduction in the noise level.
- The response of Bootstrap-F to new ice is similar to that of other 19 and 37 GHz algorithms and comparable to the near-90 GHz algorithms. The spatial resolution of Bootstrap is coarse compared to the near-90 GHz algorithms.
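The combination scheme described here, Bootstrap-F at low concentrations and Bristol at high concentrations, can be sketched as a concentration-dependent linear blend. The blending thresholds below are illustrative assumptions, not the operational OSISAF values.

```python
def osisaf_like_blend(c_bootstrap_f, c_bristol, lo=0.4, hi=0.7):
    """Merge two SIC estimates (as fractions, 0-1): use Bootstrap-F
    below `lo`, Bristol above `hi`, and blend linearly in between.
    The lo/hi thresholds are placeholder values for illustration."""
    w = min(max((c_bootstrap_f - lo) / (hi - lo), 0.0), 1.0)
    return (1.0 - w) * c_bootstrap_f + w * c_bristol

sic = osisaf_like_blend(0.55, 0.60)  # halfway through the blend zone
```

Driving the weight with the open-water-optimised estimate keeps the merged product continuous while letting each algorithm dominate in the regime where it performs best.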

Therefore we recommend an OSISAF-like algorithm with dynamical tie-points and atmospheric correction of TBs. The selection of tie-points should be done using an independent algorithm that is sensitive to melt ponds, e.g. NASA Team. It is not recommended to use cloud liquid water for atmospheric correction, but the other parameters (wind speed, water vapour and surface temperature) give a clear noise reduction. In addition, dynamical tie-points are a prerequisite for estimating the uncertainties.
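Dynamic tie-points of the kind recommended here amount to averaging TBs over many samples of a known surface type within a trailing time window, so that noise is suppressed while slow changes in signature (and instrument drift) are tracked. A minimal sketch, with invented sample values:

```python
import numpy as np

def dynamic_tiepoint(tb_samples, window=30):
    """Dynamic tie-point: mean TB over the most recent `window` samples
    of a known surface type (e.g. confirmed 100% ice or open water).
    Averaging many samples trades time resolution for noise reduction."""
    tb = np.asarray(tb_samples, dtype=float)
    return tb[-window:].mean()

# Illustrative ice-surface TBs (K) from a trailing window:
tp = dynamic_tiepoint([250.0, 252.0, 251.0, 249.0], window=4)
```

The window length is the tension noted above: it must be long enough to suppress noise, yet some of the summer variability occurs on day-to-week time scales that a long window cannot follow.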

14 Acknowledgements

A number of people have contributed to producing this document. Roberto Saldo provided all the collocated TBs for the entire RRDP. Georg Heygster provided the SMOS thin-ice data. Anja Rösel and Stefan Kern provided the MODIS melt pond data. Gorm Dybkjær provided the ERA-Interim data. Thomas Lavergne provided many helpful comments and ideas on how to carry out the validation. All of the above helped in the discussion of the results and the methods applied. In addition, Ludovic Brucker of NASA Goddard and Mohammed Shokr of Environment Canada helped by providing results from applying their algorithms (NT2 and ECICE) to our validation datasets.

15 References

Andersen, S., R. Tonboe, L. Kaleschke, G. Heygster, and L. T. Pedersen (2007), Intercomparison of passive microwave sea ice concentration retrievals over the high-concentration Arctic sea ice, J. Geophys. Res., 112, C08004, doi:10.1029/2006JC003543.

Andersen, S., R. Tonboe, S. Kern, and H. Schyberg (2006), Improved retrieval of sea ice total concentration from spaceborne passive microwave observations using numerical weather prediction model fields: An intercomparison of nine algorithms, Remote Sensing of Environment, 104, 374-392.

Beitsch, A., S. Kern, and L. Kaleschke (2012), Comparison of AMSR-E sea ice concentrations with ASPeCt ship observations around Antarctica, IEEE IGARSS'12, Munich, July 23-27, 2012.

Brun, E., E. Martin, V. Simon, C. Gendre, and C. Coleou (1989), An energy and mass model of snow cover suitable for operational avalanche forecasting, Journal of Glaciology, 35(121), 333-342.

Heygster, G., M. Huntemann, and H. Wang (2012), Polarization-based SMOS sea ice thickness retrieval algorithm (Algorithm Theoretical Basis Document (ATBD)), Technical Report, Institute of Environmental Physics, University of Bremen.

Jordan, R. (1991), A one-dimensional temperature model for a snow cover, CRREL SP 91-16.

Kaleschke, L., X. Tian-Kunze, N. Maaß, M. Mäkynen, and M. Drusch (2012), Sea ice thickness retrieval from SMOS brightness temperatures during the Arctic freeze-up period, Geophys. Res. Lett., 39, L05501, doi:10.1029/2012GL050916.

Kern, S., and G. Heygster (2001), Sea ice concentration retrieval in the Antarctic based on the SSM/I 85 GHz polarization, Annals of Glaciology, 33, 109-114.

Kern, S. (2004), A new method for medium-resolution sea ice analysis using weather-influence corrected Special Sensor Microwave/Imager 85 GHz data, Int. J. Rem. Sens., 25(21), 4555-4582.

Mätzler, C. (1998), Improved Born approximation for scattering of radiation in a granular medium, Journal of Applied Physics, 83(11), 6111-6117.

Mätzler, C., P. W. Rosenkranz, A. Battaglia, and J. P. Wigneron (Eds.) (2006), Thermal Microwave Radiation - Applications for Remote Sensing, IEE Electromagnetic Waves Series, London, UK.

Maykut, G. A., and N. Untersteiner (1971), Some results from a time-dependent thermodynamic model of sea ice, Journal of Geophysical Research, 76(6), 1550-1575.

Maykut, G. A. (1986), The surface heat and mass balance, in: N. Untersteiner (Ed.), The Geophysics of Sea Ice, pp. 395-464, NATO ASI Series, Plenum Press, New York and London.

Melsheimer, C., G. Heygster, and L. Toudal Pedersen (2009), Retrieval of sea ice emissivity and integrated retrieval of surface and atmospheric parameters over the Arctic from AMSR-E data, Journal of the Remote Sensing Society of Japan, 29(1), 236-241.

Nakawo, M., and N. K. Sinha (1981), Growth rate and salinity profile of first-year sea ice in the high Arctic, Journal of Glaciology, 27(96), 315-330.

Ozsoy-Cicek, B., S. Kern, S. F. Ackley, H. Xie, and A. E. Tekeli (2010), Intercomparisons of Antarctic sea ice properties from ship observations, active and passive microwave satellite observations in the Bellingshausen Sea, Deep Sea Research II, doi:10.1016/j.dsr2.2010.10.031.

Ozsoy-Cicek, B., H. Xie, S. F. Ackley, and K. Ye (2009), Antarctic summer sea ice concentration and extent: comparison of ODEN 2006 ship observations, satellite passive microwave and NIC sea ice charts, The Cryosphere, 3(1), 1-9.

Pedersen, L. T. (1994), Merging microwave radiometer data and meteorological data for improved sea ice concentrations, EARSeL Advances in Remote Sensing, 3, 81-89.

Roesel, A., L. Kaleschke, and G. Birnbaum (2012a), Melt ponds on Arctic sea ice determined from MODIS satellite data using an artificial neural network, The Cryosphere, 6, 431-446.

Roesel, A., L. Kaleschke, and S. Kern (2012b), Influence of melt ponds on microwave sensors' sea ice concentration retrieval algorithms, IEEE IGARSS'12, Munich, July 23-27, 2012.

Tschudi, M. A., J. A. Maslanik, and D. K. Perovich (2008), Derivation of melt pond coverage on Arctic sea ice using MODIS observations, Remote Sens. Environ., 112, 2605-2614.

Tonboe, R. T. (2005), A mass and thermodynamic model for sea ice, Danish Meteorological Institute Scientific Report 05-10.

Tonboe, R. T., and S. Andersen (2004), Modelled radiometer algorithm ice concentration sensitivity to emissivity variations of the Arctic sea ice snow cover, Danish Meteorological Institute Scientific Report 04-03.

Warren, S. G., I. G. Rigor, N. Untersteiner, V. F. Radionov, N. N. Bryazgin, Y. I. Alexandrov, and R. Colony (1999), Snow depth on Arctic sea ice, Journal of Climate, 12, 1814-1829.

Wentz, F. J., and T. Meissner (2000), AMSR Ocean Algorithm, Version 2, vol. 121599A-1, p. 66, Remote Sensing Systems, Santa Rosa, CA.

Wentz, F. J. (1983), A model function for ocean microwave brightness temperatures, Journal of Geophysical Research, 88(C3), 1892-1908.

Wentz, F. J. (1997), A well calibrated ocean algorithm for Special Sensor Microwave/Imager, Journal of Geophysical Research, 102(C4), 8703-8718.

Wiesmann, A., and C. Mätzler (1999), Microwave emission model of layered snowpacks, Remote Sensing of Environment, 70(3), 307-316.


Wiesmann, A., C. Fierz, and C. Mätzler (2000), Simulation of microwave emission from physically modeled snowpacks, Annals of Glaciology, 31(1), 397-405.

Worby, A. P., and J. C. Comiso (2004), Studies of the Antarctic sea ice edge and ice extent from satellite and ship observations, Remote Sensing of Environment, 92, 98-111.


< End of Document >
