Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 9:10-9:50,
Poor-man's ensembles have been used for a number of years as a method of capturing forecast
uncertainty. By using different forecast models they provide
a sample of the model uncertainty that affects a given forecast. By using different-resolution
versions of the same model it is possible to reproduce some of the results
seen in a poor-man's ensemble. This derives from the fact that models run at different
resolutions are nearly as skilful as each other, yet can produce quite different
forecasts. These differences indicate that it is possible to produce a combined product
which is more skilful than the high-resolution forecast alone.
The greater skill of combined forecasts at multiple resolutions is shown in the context of
both a global forecasting system and a regional forecasting system. For the latter,
combining forecasts from systems already run at the Met Office can lead to
substantial improvements in deterministic forecast quality - around 8%. These
results suggest that ensemble-based approaches can produce deterministic forecasts that
are substantially more skilful than forecasts from deterministic systems.
It may be argued that comparing an ensemble-based forecast with a deterministic forecast
is not a fair comparison. To address this, tests are described in which
an analysis is performed at the ensemble resolution and a fraction of the ensemble
members are centred around it. This approach should retain the improved ensemble-mean
performance seen in the above tests whilst producing a reliable ensemble.
Adventures in resolution
UK Met Office
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 9:50-10:30,
We describe the initial perturbations generated by the Ensemble Transform (ET) and ET-with-rescaling methods, and compare them with the operational breeding ensemble in the NCEP operational environment. Different rescaling strategies are tested and compared.
Both ET and ET with rescaling belong to the second generation of methods and generate initial perturbations that are consistent with the operational data assimilation (DA) system. The initial analysis error variance from DA can be used to constrain the initial perturbations, which are orthogonal with respect to an inverse analysis error variance norm. The simplex transformation (ST) imposed ensures that the initial perturbations are centered and span a subspace with the maximum number of degrees of freedom.
We also describe the ways of estimating the analysis error variance from NCEP GSI data assimilation system and from multi-center analysis data.
Results will be shown.
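As a rough illustration of the centering and rescaling operations described above, the sketch below centers a small set of perturbations (the role the simplex transformation plays in guaranteeing zero-mean perturbations) and rescales them to a prescribed analysis error variance. This is an idealised pure-Python toy, not the NCEP implementation; all names and numbers are illustrative.

```python
# Hypothetical 1-D state of n grid points, k ensemble perturbations.

def centre(perts):
    """Subtract the ensemble mean so perturbations sum to zero
    (the centering guaranteed by the simplex transformation)."""
    k, n = len(perts), len(perts[0])
    mean = [sum(p[i] for p in perts) / k for i in range(n)]
    return [[p[i] - mean[i] for i in range(n)] for p in perts]

def rescale(perts, target_var):
    """Scale perturbations pointwise so their sample variance matches a
    prescribed analysis error variance (a simple rescaling strategy)."""
    k, n = len(perts), len(perts[0])
    var = [sum(p[i] ** 2 for p in perts) / (k - 1) for i in range(n)]
    return [[p[i] * (target_var[i] / var[i]) ** 0.5 if var[i] > 0 else 0.0
             for i in range(n)] for p in perts]

perts = centre([[1.0, 2.0], [3.0, -1.0], [2.0, 2.0]])
scaled = rescale(perts, [0.25, 0.25])
```

In the real system the orthogonalisation acts on full model states under the inverse analysis error variance norm; the sketch keeps only the centering and variance-matching structure.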
ET based initial perturbations and analysis error estimation for NCEP global ensemble forecast system
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 10:30-11:00,
The Met Office Global and Regional Ensemble Prediction System (MOGREPS) uses an online inflation factor calculation to calibrate the spread of the ensemble in space and time and counteract the tendency of the Ensemble Transform Kalman Filter (ETKF) to underestimate analysis uncertainty. Until 2008, this calibration mechanism relied entirely on sonde data, and was only applied locally in the extratropics. By producing more appropriate estimates of the observation error variance of ATOVS brightness temperatures, it has become possible to apply localisation uniformly over the globe, without the previous ad-hoc fixes. As a result, the distribution of spread as a function of latitude is much improved, especially in the tropics. More recent work has used estimates of ATOVS weighting functions to permit localisation in the vertical, reducing the degree of underspread near the surface and providing better control of potentially harmful perturbations near the top of the model.
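The online inflation calculation referred to above can be caricatured as follows: compare the mean squared innovation against the ensemble variance plus the observation error variance, and relax an inflation factor towards the implied value. This is a hedged sketch with an invented smoothing constant, not MOGREPS code.

```python
def update_inflation(infl, innovations, spread_var, obs_err_var, alpha=0.1):
    """innovations: (obs - ensemble-mean forecast) values at observation sites.
    spread_var: current ensemble variance at those sites (scalar here)."""
    n = len(innovations)
    d2 = sum(d * d for d in innovations) / n      # mean squared innovation
    target = max(d2 - obs_err_var, 1e-12)         # implied forecast error variance
    implied = (target / spread_var) ** 0.5        # factor needed on perturbations
    return (1 - alpha) * infl + alpha * implied   # temporal smoothing of the factor

f = 1.0
for _ in range(200):
    f = update_inflation(f, [2.0, -2.0, 2.0, -2.0], spread_var=1.0, obs_err_var=1.0)
```

With these synthetic numbers the factor relaxes towards the value that makes ensemble variance account for the innovation variance not explained by observation error.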
Improving the use of observations to calibrate ensemble spread
J Flowerdew & NE Bowler
UK Met Office
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 11:30-12:00,
A significant proportion of the model error in the boundary layer capping inversion comes from a positional error in the height of the inversion.
Even though a model's forecast (the background state in data assimilation) may have a realistic structure in the vertical, a small positional error can lead to a large disagreement between the observed and background temperatures in this region. Difficulties arise in the assimilation of these temperature profiles using variational techniques because static background variances and covariances prohibit the large, sharp analysis increment necessary to give an analysis with an accurate inversion structure and height. The background correlation structure associated with the boundary layer is difficult to model because of its strong flow dependence, yet a physically reasonable background error correlation structure is important: an inaccurate one leads to information from the observations erroneously spreading across the inversion. This is a particular issue when the background and the observations disagree on the height of the inversion. Using the Met Office MOGREPS ensemble product, a flow-dependent background error covariance matrix can be obtained. This has been used in conjunction with a new scheme that introduces a positional error associated with the background inversion, to give an improved analysis of the inversion. This additionally has an impact on the diagnosis of boundary layer cloud, which is an important feature within a forecast.
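At its simplest, the flow-dependent background error covariance referred to above is a sample covariance of ensemble perturbations. The following is a minimal pure-Python sketch on a toy two-level state, not the Met Office code; in practice such covariances are localised and used within the assimilation rather than formed explicitly.

```python
def ensemble_covariance(members):
    """Sample covariance of an ensemble of state vectors."""
    k, n = len(members), len(members[0])
    mean = [sum(m[i] for m in members) / k for i in range(n)]
    cov = [[0.0] * n for _ in range(n)]
    for m in members:
        d = [m[i] - mean[i] for i in range(n)]   # perturbation about the mean
        for i in range(n):
            for j in range(n):
                cov[i][j] += d[i] * d[j] / (k - 1)
    return cov

# Toy two-level "temperature profile" ensemble (illustrative values only).
C = ensemble_covariance([[1.0, 2.0], [3.0, 6.0]])
```

When ensemble members place a sharp inversion at slightly different heights, the resulting covariances near the inversion become large and strongly flow dependent, which is precisely the structure a static B matrix cannot supply.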
Positional error in the atmospheric boundary layer capping inversion
University of Reading
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 12:00-12:30,
Data assimilation provides techniques for combining observations and prior model forecasts
to create initial conditions for numerical weather prediction (NWP). The relative
weighting assigned to each observation in the analysis is determined by its associated error.
The optimal utilisation of high resolution satellite observations in NWP is being hindered
by the mis-specification of their error correlation structure. Variational data assimilation
algorithms are run under the assumption of uncorrelated observation errors; yet the
contrasting model and satellite observation resolutions make this assumption unrealistic.
Treating observation errors as independent requires significant thinning of the data, and
much of the available fine-scale information is lost. To build on recent advances
in high-resolution forecasting, an alternative approach to dealing with observation error
correlations is needed.
The main issues with taking account of the error correlations from remote sensing data
are summarised as: quantifying the correlations, optimising computational efficiency, and
ensuring a well-conditioned problem.
Using a post-analysis diagnostic derived from variational data assimilation theory, we
quantify cross-channel correlations between IASI (Infrared Atmospheric Sounding Interferometer)
observations used in the Met Office incremental 4D-Var assimilation scheme.
Diagnosed error covariances are given for the pre-processing 1D-Var assimilation and the
main 4D-Var assimilation, and comparisons are made with the error statistics currently
assumed operationally.
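Diagnostics of this type estimate the observation error covariance from statistics of background and analysis departures (the expectation of products of analysis-minus-observation and background-minus-observation residuals). The sketch below applies this on synthetic two-channel data; it is illustrative only, not the Met Office implementation.

```python
def diagnosed_R(bg_departures, an_departures):
    """Estimate R as the sample mean of d_a d_b^T over paired
    background (d_b) and analysis (d_a) departure vectors."""
    n = len(bg_departures)
    p = len(bg_departures[0])
    R = [[0.0] * p for _ in range(p)]
    for db, da in zip(bg_departures, an_departures):
        for i in range(p):
            for j in range(p):
                R[i][j] += da[i] * db[j] / n
    return R

# Synthetic two-channel departures (invented numbers, for shape only).
db = [[2.0, 0.0], [0.0, 2.0]]
da = [[1.0, 0.0], [0.0, 1.0]]
R = diagnosed_R(db, da)
```

In practice the raw estimate is symmetrised and accumulated over large samples of assimilated observations.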
Quantifying observation error correlations in remotely sensed data
University of Reading
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 12:30-1:00,
Currently, most implementations of 3D and 4D-Var are based on simple models of the background error covariance structure, with static parameters estimated from seasonal NWP statistics. Although 4D-Var brings in a degree of flow dependence, the true "errors of the day"
remain poorly represented.
Given an ensemble system able to estimate short-range background error covariances, VAR can be modified to make use of them, either alone or in combination with the static covariances. At the Met Office, a hybrid system of this kind was developed during the late 1990s, using additional "alpha" control variables to control the introduction of ensemble-based information. However, the work was put on hold for lack of a good source of error modes. Following the recent operational implementation of the Met Office's MOGREPS ensemble, this work is now being resurrected in our global 4D-Var system, with the aim of operational implementation within the next year or so. After describing the basic design of the system, early results from stand-alone tests will be presented, highlighting some of the issues that will need to be resolved prior to implementation.
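The hybrid idea can be summarised as blending the static and ensemble covariances with squared weights; inside VAR the alpha control variables achieve this implicitly, but the explicit toy version below shows the structure. Weights and matrices are invented for illustration.

```python
def hybrid_covariance(B_static, P_ens, beta_c2, beta_e2):
    """Blend a static climatological B with a (localised) ensemble
    covariance: P = beta_c2 * B + beta_e2 * P_ens."""
    n = len(B_static)
    return [[beta_c2 * B_static[i][j] + beta_e2 * P_ens[i][j]
             for j in range(n)] for i in range(n)]

# Toy 2-variable state: identity climatological B, correlated ensemble part.
P = hybrid_covariance([[1.0, 0.0], [0.0, 1.0]],
                      [[2.0, 1.0], [1.0, 2.0]],
                      beta_c2=0.5, beta_e2=0.5)
```

Choosing the two weights trades robustness of the static covariances against the flow dependence of the ensemble ones.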
Development of a hybrid variational/ensemble DA system at the Met Office
Adam Clayton, Dale Barker & Andrew Lorenc
UK Met Office
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 2:00-2:30,
This talk will describe the use of stochastic space-time models to generate ensembles of rainfall forecasts that are conditioned on the current observed rainfall field and an NWP rainfall forecast. The rate of change of the rain field in Lagrangian coordinates is calculated for a number of spatial scales, and this information is used to calculate the error in an advection forecast as a function of scale and lead time. The error in the NWP forecast for the current rain field is estimated as a function of scale, and the advection and NWP forecasts are blended together to produce the deterministic component of the nowcast. The skill of the deterministic forecast is calculated, and an ensemble is generated by perturbing the deterministic forecast with stochastic noise, so that the statistical structure of the observed rainfall is reproduced in each ensemble member and the width of the ensemble represents the forecast uncertainty.
Ensemble rainfall nowcasting using stochastic models
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 2:30-3:00,
A new probabilistic data assimilation method for short-term rainfall forecasting from radar observations is introduced. The spatial model relies on a decomposition of the observed rainfall field into precipitation areas ('cells'). The characteristics of the 'cells', that is the parameters of the model, are given relevant priors and their estimation is performed within a Bayesian framework. The cell parameters are estimated using an approximate variational Bayesian method, in which the Kullback-Leibler divergence between the approximate posterior (known up to its parameters) and the true posterior (given by Bayes' rule, known up to a constant) is minimised. Variational Bayesian methods could be thought of as a probabilistic version of 3D-Var assimilation. The cells' advection is represented as a smooth vector Gaussian process, for which the posterior distribution is estimated in a Kalman filter-like fashion, using the cells' displacements over time as pseudo-observations. The model is tested on real radar data from the UK Met Office, both in a convective and a frontal setting. The performance of the model is assessed using standard and probabilistic validation methods.
Overall, the model shows very good assimilation skill and reasonable forecasting skill given the simplistic forecasting method used. Extensions are discussed.
Approximate Bayesian Precipitation Forecasting
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 3:30-4:00,
Various statistical methods are used to process operational Numerical Weather Prediction (NWP) products with the aim of reducing forecast errors, and they often require sufficiently large training data sets. Generating a hindcast data set for this purpose can be costly, and a well-designed algorithm should be able to reduce the required size of these data sets. This issue is investigated in the relatively simple case of bias correction, by comparing a Bayesian algorithm for bias estimation with the conventionally used empirical method. As available forecast data sets are not large enough for a comprehensive test, synthetically generated time series representing the analysis (truth) and the forecast are used to increase the sample size. Since these synthetic time series retain the statistical characteristics of the observations and operational NWP model output, the results of this study can be extended to real observations and forecasts; this is confirmed by a preliminary test with real data.
By using the climatological mean and standard deviation of the meteorological variable in question, and the statistical relationship between the forecast and the analysis, the Bayesian bias estimator outperforms the empirical approach in terms of the accuracy of the estimated bias, and can reduce the required size of the training sample by a factor of 3. This advantage of the Bayesian approach arises because it is less susceptible to the sampling error inherent in consecutive sampling. These results suggest that a carefully designed statistical procedure may reduce the need for the costly generation of large hindcast data sets.
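The contrast between the two estimators can be sketched in a few lines: the empirical estimate is a plain sample mean of forecast-minus-analysis differences, while a Bayesian estimate shrinks that mean towards a climatological prior. The prior and noise variances below are invented for illustration, not values from the study.

```python
def empirical_bias(diffs):
    """Conventional estimate: sample mean of forecast - analysis."""
    return sum(diffs) / len(diffs)

def bayes_bias(diffs, prior_mean=0.0, prior_var=1.0, noise_var=4.0):
    """Posterior-mean bias under a Gaussian prior on the bias:
    a precision-weighted blend of sample mean and prior mean."""
    n = len(diffs)
    w = (n / noise_var) / (n / noise_var + 1.0 / prior_var)
    return w * empirical_bias(diffs) + (1 - w) * prior_mean

d = [2.0, 4.0]   # two synthetic forecast-minus-analysis differences
```

With small samples the shrinkage suppresses sampling noise, which is the mechanism behind the reduced training-sample requirement described above.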
An assessment of a Bayesian Bias Estimator for Numerical Weather Prediction
Joohyung Son, D. Hou and Z. Toth
Monday 16 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 4:00-4:30,
Ensemble Prediction Systems based around the Bureau's GASP and LAPS operational global and regional NWP systems were developed in BMRC from the mid-1990s until the early to mid-2000s. The GASP EPS has been running in operations since 2001. The LAPS EPS was run for several years in daily research mode, until 2007. With the change in research direction to the UKMO Unified Model based ACCESS system since 2006, an ACCESS version based on the Met Office combined global and regional ensemble prediction system (MOGREPS) has been implemented, with the name tag AGREPS for the ACCESS Australian Region version. The design and history of these ensemble systems will be described, along with future plans for AGREPS in research and operations.
GASP-EPS, LAPS-EPS and AGREPS: Global and Regional Ensemble Prediction Systems based on BMRC and ACCESS NWP systems
Michael Naughton, Kamal Puri & Terence O'Kane
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 9:10-9:50,
A 4-dimensional variational ensemble data assimilation (DA) scheme is introduced that (a) finds the analysis corrections in a global variational solve, and (b) uses flow-adaptive propagating error covariance localization in order to allow for long-time-window DA. The scheme is based on a new method for inexpensively generating the square root of an adaptively localized global 4-dimensional error covariance model in terms of products, or modulations, of smoothed ensemble members with themselves and with raw ensemble members. The columns of the square root of this matrix may be interpreted as members of a modulation ensemble. For a 100-member raw ensemble, the modulation ensemble contains one million members. With the help of a global numerical weather prediction (NWP) model we (a) show that this million-member ensemble provides a plausible model of 4-dimensional forecast error covariances, (b) show that the initial-time error covariance model it provides is superior to that used by the operational (3D-Var) DA scheme of the US Navy, and (c) show that the statistical Tangent Linear Model (TLM) implied by the 4D covariances is of comparable accuracy to those used in 4D-Var data assimilation schemes.
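The core construction can be sketched very compactly: each modulation member is the element-wise product of a smoothed ensemble member with a raw member, and the outer product of the modulation ensemble yields an element-wise (Schur) localised covariance. The tiny example below shows only the combinatorial structure, not the Navy implementation.

```python
def modulation_ensemble(smoothed, raw):
    """All element-wise products s_i * x_j of smoothed members with raw
    members; the columns of the localised covariance square root."""
    return [[s[k] * x[k] for k in range(len(x))]
            for s in smoothed for x in raw]

# One smoothed member and two raw members over a 2-point state (toy values).
mods = modulation_ensemble([[1.0, 0.5]], [[2.0, 2.0], [1.0, -1.0]])
```

The member count multiplies: M smoothed members and K raw members give M*K modulation members, which is how a 100-member raw ensemble can yield a million-member modulation ensemble.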
Data Assimilation Using Modulated Ensembles
Craig Bishop & Daniel Hodyss
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 9:50-10:30,
The latest major change of the ECMWF Ensemble Prediction System (EPS) was implemented in March 2008, when the 15-day variable resolution ensemble (VAREPS) was merged with the coupled monthly forecasting system into a new, seamless system ranging from day 0 to day 32. This seamless EPS has a TL399L62 resolution up to day 10 and a TL255L62 resolution thereafter; it uses persisted sea surface temperature anomalies up to day 10, and a coupled ocean model from day 10 (daily to day 15 and on Thursdays to day 32) at 00 UTC. The implementation of the seamless system also included a re-forecast suite that provides users with enough data to estimate the model climate distribution, and thus to correct the forecast distribution using re-calibration methods. In the first part of this talk, the current status of the ECMWF ensemble system will be presented, and the potential value of the calibration data set will be illustrated.
Work is in progress to finalize the next major EPS change, planned for 2009, which will revise the method used to define the initial perturbations. The plan is to improve the simulation of forecast uncertainties due to errors growing during the data-assimilation cycle, using an ensemble of 4-dimensional variational analyses generated by perturbing the observations in each data-assimilation cycle. In the second part of this talk, preliminary results from these ongoing tests will be discussed.
Ensemble Data Assimilation and Prediction at ECMWF
R Buizza, T Palmer, R Hagedorn, L Isaksen, M Leutbecher & F Vitart
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 10:30-11:00,
The calculation of the error covariance of fields described by an evolving nonlinear system requires the solution of an infinite hierarchy of moment equations; this problem is formally identical to the closure problem that arises in statistical approaches to turbulence. Typically Kalman filter methods simply discard cumulants of third order and higher, although these cumulants are often required to ensure that regime transitions in nonlinear models are accurately tracked (Miller et al. 1994). More generally it would seem to be desirable to develop a tractable data assimilation scheme that incorporates information about the higher-order cumulants. To this end we formulate a statistical dynamical Kalman filter (SDKF) methodology and compare its performance to both stochastic and deterministic ensemble data assimilation methodologies (O'Kane and Frederiksen; Entropy 2008, 10, 684-721; DOI: 10.3390/e10040684). The SDKF employs a quasi-diagonal direct interaction closure developed for the general statistical problem of mean fields interacting with inhomogeneous turbulence and topography, with prognostic equations for the statistics of the mean field and the inhomogeneous covariance.
A statistical dynamical Kalman filter for geophysical flows
Terence J. O'Kane & Jorgen S. Frederiksen
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 11:30-12:00,
The adoption of the Met Office 4dVAR data assimilation system for ACCESS has been a dramatic change for the Bureau of Meteorology on two counts: firstly in forecast skill, and secondly in terms of culture and philosophy.
The new system has clearly yielded immense gains in forecast skill across all scales. There are many reasons for this, and exploring these reasons provides valuable insights for other groups developing assimilation systems. While both Ensemble Kalman Filters and 4dVAR systems have their strengths, it is important to acknowledge the full cost of implementing these systems. While 4dVAR systems are more costly to develop, the difference may not be as great as is occasionally claimed.
This talk will summarise the gains made with the ACCESS 4dVAR system, and take the opportunity to raise some of the issues involved in producing high-quality, operationally robust systems that are sometimes ignored in comparisons between EnKF and 4dVAR systems.
The introduction of 4dVAR at the Bureau of Meteorology
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 12:00-12:30,
A major limitation of the EnKF is that the finite ensemble size introduces sampling error into the background covariances, with severe consequences for practical applications. The negative effects of sampling error are customarily limited by covariance localisation, which earlier studies have suggested may introduce imbalance into the system. The deleterious effects of localisation upon balance are confirmed and detailed here, with localisation producing analyses with weaker geostrophic balance and stronger divergence than those obtained using the unlocalised covariances. These imbalances reduce as the localisation radius is increased, but are argued to be large for typical settings.
An improved method for calculating local covariances from an ensemble is presented, in which the localisation is performed in streamfunction-velocity potential (psi-chi) space, rather than wind-component space. Analyses using this method fully preserve the balances contained within the unlocalised covariance model. This transformation further allows the option of intervariable localisation, in which the cross-covariances involving chi, which are weak and therefore particularly subject to sampling error, are set to zero instead of being calculated from the ensemble. The various localisations are compared in a series of identical-twin experiments, and the new localisations produce analyses that are better balanced and significantly more accurate than the usual approach. The localisation with the chi cross-covariances zeroed is shown to be superior for the smaller ensemble sizes but not for the larger, implying that the larger ensembles are capable of resolving some of the true chi cross-covariance in the test system.
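For reference, the standard localisation whose balance side-effects are analysed above is a Schur (element-wise) product of the ensemble covariance with a distance-based taper. The sketch below uses a simple exponential taper for brevity; an operational system would use a compactly supported function such as Gaspari-Cohn.

```python
import math

def localise(cov, positions, length_scale):
    """Element-wise (Schur) product of a covariance matrix with a
    distance-based taper, damping long-range sample correlations."""
    n = len(cov)
    return [[cov[i][j] * math.exp(-abs(positions[i] - positions[j]) / length_scale)
             for j in range(n)] for i in range(n)]

# Toy 2-point covariance with perfect long-range correlation.
loc = localise([[1.0, 1.0], [1.0, 1.0]], [0.0, 1.0], 1.0)
```

The taper leaves variances untouched but damps covariances between distant points, and it is exactly this element-wise modification of multivariate covariances that can disrupt geostrophic balance.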
Covariance localisation and balance in an EnKF
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 12:30-1:00,
In this work, several data assimilation methods are compared on two chaotic systems. An extended Kalman filter, an ensemble Kalman filter, a particle filter and 4D-Var (strong and weak constraint) are tested first on the Lorenz '63 system (3 variables), then on the Lorenz '96 system (40 variables). The effects of observation frequency and model non-linearity on the accuracy of each method are discussed. In particular, I address issues with applying particle filtering methods in high dimensions, showing that particle filters quickly require unrealistically large numbers of model evaluations. Some extensions towards more efficient particle filters are discussed, in particular the use of emulators as cheaper surrogates for the real model, allowing larger ensemble sizes to be handled. The essential idea of an emulator is to represent the model as an input-output mapping, which is learnt with uncertainty using a Gaussian process representation. The particles are thus propagated using the Gaussian process approximation, which introduces some additional uncertainty into the problem.
Mechanisms for working correctly with this additional uncertainty are discussed and initial results are presented.
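The degeneracy issue mentioned above is commonly measured by the effective sample size of the particle weights: with near-equal weights it equals the particle count, and it collapses towards one as a few particles dominate. A small, numerically stable sketch (illustrative, not from the talk):

```python
import math

def effective_sample_size(log_weights):
    """ESS = 1 / sum(w_i^2) for normalised weights, computed in log
    space to avoid underflow with very small likelihoods."""
    m = max(log_weights)
    w = [math.exp(lw - m) for lw in log_weights]
    s = sum(w)
    w = [x / s for x in w]
    return 1.0 / sum(x * x for x in w)
```

Equal log-weights give the full particle count, while one dominant particle drives the ESS towards one; in high dimensions the likelihood concentrates on ever fewer particles, which is why ensemble sizes must grow so rapidly.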
Emulation-based ensemble filtering
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 2:00-2:30,
The dynamic variability and seasonal predictability of an intermediate complexity coupled ocean-atmosphere model is examined in multi-decadal integrations. The model comprises a two-level global atmosphere and a two-level Pacific basin ocean. The model has a realistic climatology, and ENSO-like variability, for a range of the model's parameters, even at relatively low resolution. The model is relatively computationally efficient, which makes it ideally suited to exploring ensemble forecasting strategies for seasonal prediction in both idealized and hindcast modes. In this paper, we briefly describe the model and discuss its climatology and dynamic variability as a function of the model's parameters. We also present results from single-run hindcast simulations during the period 1981 to 2000. In an accompanying paper, we show how the model can be used to design and test ensemble prediction strategies.
ENSO-like Variability and Predictability in a Simple Ocean-Atmosphere Model
Carsten S. Frederiksen, Jorgen S. Frederiksen & Ramesh C. Balgovind
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 2:30-3:00,
The skill of seasonal ensemble prediction with an intermediate complexity coupled ocean-atmosphere model is examined in hindcast simulations during the period 1992 to 2000. The coupled model consists of a global atmospheric model coupled to a Pacific basin ocean model. An ensemble prediction scheme based on perturbations that grow fast over a period of a month is used to generate the perturbations to the analyses; the analyses are obtained by nudging the ocean fields towards observations. The perturbations contain coupled modes with ocean temperature fields that peak in the equatorial regions and atmospheric fields characterized by large-scale teleconnection patterns. Comparisons have been made between the skill of ensemble-mean forecasts and the skill of control forecasts in 12-month hindcasts started each month for the period 1992 to 2000. We have examined how ensemble skill depends on the number, type and amplitude of ensemble perturbations. We have also examined how skill depends on the annual and ENSO cycles. In general the seasonal ensemble scheme results in ensemble-mean forecasts that are significantly more skilful than the control forecasts, as well as providing error estimates of the forecasts.
Seasonal Ensemble Prediction with a Coupled Ocean-Atmosphere Model
Jorgen S. Frederiksen, Carsten S. Frederiksen & Stacey L. Osbrough
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 3:00-3:30,
POAMA (Predictive Ocean Atmosphere Model for Australia) is an intra-seasonal to inter-annual climate prediction system based on a coupled ocean and atmosphere general circulation model. The first version, POAMA-1, with an ocean data assimilation system based on a univariate optimal interpolation (OI) approach, became operational in October 2002.
A new ocean data assimilation system, called PEODAS (POAMA Ensemble Ocean Data Assimilation System), has been developed for POAMA-2. It is based on a 3D multivariate ensemble OI with time-dependent error covariances calculated from a time-evolving model ensemble. A description of the scheme and an evaluation of the PEODAS re-analysis, as well as its impact on seasonal forecasts of ENSO, will be presented and discussed.
POAMA Ocean Reanalysis for Dynamical Seasonal Prediction
Yonghong Yin, Oscar Alves, & Peter Oke
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 4:00-4:30,
The Bluelink forecast and reanalysis system comprises a high-resolution ocean general circulation model and an ensemble optimal interpolation (EnOI) system. The Bluelink system has been integrated for a series of 15-year reanalyses and has been run operationally at the BoM for several years. This system has performed robustly, demonstrating reliable skill that is comparable to other ocean forecast systems around the world. One of the key 'home grown' components of Bluelink is the EnOI system. This system has proved to be a valuable and flexible tool for both operational and research applications. Drawing on Bluelink outcomes and a series of experiments with toy models, the case for using EnOI for earth systems applications will be made. This will include a discussion of the benefits of EnOI as well as its known limitations. EnOI is robust, flexible, portable, computationally affordable, is not burdened with the technical difficulties that some other methods carry, and is readily adapted for coupled systems. Perhaps EnOI is not a bad choice for earth systems data assimilation, particularly coupled data assimilation.
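The distinguishing feature of EnOI is that the covariances come from a static ensemble of anomalies rather than an evolving ensemble, so the analysis update is cheap. The sketch below assimilates a single scalar observation into a toy two-element state; it is a hedged illustration, not BODAS.

```python
def enoi_update(xb, anomalies, obs, obs_idx, obs_err_var):
    """One EnOI analysis step for a single observation of state element
    obs_idx, with covariances from a static ensemble of anomaly vectors."""
    k, n = len(anomalies), len(xb)
    # Covariance of each state element with the observed element.
    cov = [sum(a[i] * a[obs_idx] for a in anomalies) / (k - 1) for i in range(n)]
    s = cov[obs_idx] + obs_err_var        # innovation variance (HPH^T + R)
    d = obs - xb[obs_idx]                 # innovation
    return [xb[i] + cov[i] * d / s for i in range(n)]

# Two anomaly members that correlate the two state elements perfectly.
xa = enoi_update([0.0, 0.0], [[1.0, 1.0], [-1.0, -1.0]],
                 obs=4.0, obs_idx=0, obs_err_var=2.0)
```

Because the anomalies correlate the two elements, the single observation increments both, which is how EnOI spreads information multivariately at a fraction of an EnKF's cost.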
Ocean Data Assimilation: making a case for ensemble optimal interpolation
Tuesday 17 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 4:30-5:00,
The BLUElink Ocean Data Assimilation System (BODAS) is an Ensemble Optimal Interpolation (EnOI) scheme initially tested within BRAN (BLUElink ReANalysis) and routinely used to produce analyses in the OceanMAPS forecasting system. The operational version of OceanMAPS is capable of assimilating data from different sources, including satellite altimetry from a variety of platforms and satellite-derived Sea Surface Temperatures (SSTs). Recent developments have included a new SST data stream in BODAS. The NAVOCEANO SST product has been chosen for its accuracy and because its coverage complements well that of the AMSR-E SST, which was already being assimilated into BODAS. This new implementation is presently being tested with a view to integrating it into the operational system if it shows improvement. We will compare the two assimilated SST sets and the respective resulting analyses using independent observations, with an interest in the physics of the system.
Including a new data stream in BLUElink Ocean Data Assimilation System
Isabel Andreu-Burillo, Gary Brassington, Helen Beggs, & Peter Oke
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 9:10-9:40,
Based on the current Australian Bureau of Meteorology data assimilation infrastructure, an Ensemble Kalman Filter (EnKF) for global data assimilation has been developed in this study. A 64-member ensemble with a T119L19 GASP forecasting model has been used, utilising global conventional sounding and satellite ATOVS observational data. The purpose of this research is to test the basic ability and performance of the EnKF compared to the Generalised Statistical Interpolation (GenSI) scheme for global-scale data assimilation. An important feature of our study is that a hybrid 1D-Var and EnKF scheme has been applied to incorporate the satellite 1D-Var technique, using NOAA-15 and NOAA-16 data, into the ensemble data assimilation. As a first step, ensemble-mean fields have been used as the background field in the 1D-Var satellite radiance retrieval, which avoids the difficult problem of vertical localisation of the covariances during radiance assimilation.
Detailed forecast verification for our EnKF system is performed over both observation locations and model grids. The verification results show that EnKF forecasts achieve a similar or better level of skill compared to the currently operational GenSI scheme when run at the same model resolution. Furthermore, the tests demonstrate that incorporating 1D-Var into the EnKF can improve forecast accuracy, particularly for the upper atmosphere, although our hybrid Var-EnKF system is relatively simple.
Global EnKF Data Assimilation System: Some initial verification Results
Xudong Sun, Jeff Kepert & Peter Steinle
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 9:40-10:10,
We consider the situation where data assimilation is required for a
system in which some variables are directly observable while for others
only some integrated information, such as their variance, is available.
We present a simple method to incorporate information about the variance
of some unresolved scales into a Kalman filter. We illustrate our method
with the Lorenz-63 and Lorenz-96 systems.
Variance Constraining Kalman Filters
Lewis Mitchell & Georg Gottwald
University of Sydney
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 10:10-10:40,
This study presents a new rationale for medium- to long-term forecasting of streamflow at multiple locations in a catchment, using a dynamic model combination approach along the lines presented in Chowdhury and Sharma (2009). The dynamic model combination is presented as a pair-wise combination of multiple forecasting models using a hierarchical structure that aims to pair the models on the basis of the lowest commonality they contain. Unlike the more established ensemble averaging approaches, the proposed approach is dynamic, in that the combination weights vary with time and are predicted using a statistical model that mimics persistence in the model weights using an autoregressive model structure. Use of such a dynamic combination enables individual constituent models to be represented to varying extents, thereby enabling forecasts that can rely heavily on models that exhibit high accuracy on an inconsistent basis (accuracy in specific seasons, or over more extended segments of time).
The dynamic model combination is performed in two stages. In the first stage, three models (one dynamical and two statistical) for forecasting global sea surface temperature anomalies (SSTAs) are dynamically combined. The resulting forecasts are found to offer improvements over any single SSTA forecasting approach in over 90 percent of the grid cells. These dynamically combined SSTA forecasts then form the basis for predicting (concurrently) streamflow at five locations in the Namoi basin in Australia using three statistical approaches. These three approaches are designed to have significantly different model forms and associated SSTA predictor variables. These concurrent forecasts are then combined using the dynamic combination approach, and the overall improvements documented.
It is found that the dynamic combination of the streamflow forecasting models, while offering an overall improvement in predictive accuracy compared to any single forecasting scheme, offers smaller improvements than were obtained in the case of the SSTA dynamic combination forecasts. Our presentation concludes with suggestions on why this is so, offering insights for future studies on when model combination is most appropriate.
Chowdhury, S. and Sharma, A., 2009. Long Range NINO3.4 Predictions Using Pair Wise Dynamic Combinations of Multiple Models. Journal of Climate, 22(3), doi:10.1175/2008JCLI2210.1.
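The flavour of a time-varying pair-wise combination can be sketched as follows; the AR(1) model for the weights and the clipping to [0, 1] are simplifying assumptions for illustration, not the formulation of Chowdhury and Sharma (2009).

```python
import numpy as np

def fit_ar1(w):
    # fit mean and lag-1 coefficient of an AR(1) model to past weights
    w = np.asarray(w, dtype=float)
    x, y = w[:-1] - w.mean(), w[1:] - w.mean()
    return w.mean(), (x @ y) / (x @ x)

def next_weight(w_hist):
    # one-step AR(1) prediction of the combination weight, kept in [0, 1]
    mu, phi = fit_ar1(w_hist)
    w = mu + phi * (w_hist[-1] - mu)
    return min(max(w, 0.0), 1.0)

def combine(f1, f2, w):
    # pair-wise combined forecast: w * model 1 + (1 - w) * model 2
    return w * f1 + (1.0 - w) * f2
```

Because the weight persists rather than being fixed, a model that is only seasonally skilful can dominate the combination during the periods when it performs well.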
Hydro-climatological Model Combination: Should it be Static or Dynamic?
Shahadat Chowdhury & Ashish Sharma
NSW Department of Water and Energy
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 11:00-11:30,
This talk will briefly outline some of the theoretical assumptions implicit in multi-model ensemble modelling, with a view to stimulating a discussion about issues such as model independence, the relationship between independence and performance, and differences in our understanding of these concepts in coupled and uncoupled environments.
Conceptions of model independence in coupled and uncoupled multi-model ensembles
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 11:30-12:00,
In this presentation a phase-based verification technique will be discussed. The methodology is based on a variational minimization technique using a quadratic cost function, having obvious links to variational assimilation. In the technique a deformation matrix is recovered that minimizes the distance between two sets of input. The resulting deformation can then be used for two purposes: 1) detecting phase/position errors and 2) correcting them. While both uses have merit, focusing on either will benefit both as they are intimately related.
Whilst not being an actual implementation of EPS or VAR, the methodology is related to future improvements in assimilation and presents a slight variation on papers already published in the EPS field. The methodology will be used to verify a set of synthetic ensemble forecasts and highlight the importance of scale-separation for mesoscale verification and assimilation.
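A one-dimensional analogue conveys the idea: recover the displacement that minimises a quadratic cost between two fields. Here a brute-force search over integer shifts stands in for the variational deformation-matrix minimisation described in the talk; the example fields are assumptions.

```python
import numpy as np

def best_shift(forecast, observed, max_shift=10):
    # brute-force search for the displacement minimising the quadratic cost
    shifts = list(range(-max_shift, max_shift + 1))
    costs = [np.sum((np.roll(forecast, s) - observed) ** 2) for s in shifts]
    return shifts[int(np.argmin(costs))]

x = np.linspace(0.0, 2.0 * np.pi, 100)
obs = np.sin(x)
fcst = np.roll(obs, -5)          # forecast displaced by five grid points
shift = best_shift(fcst, obs)    # recovered displacement
```

The recovered shift serves both purposes mentioned above: it quantifies the phase error, and applying it corrects the forecast before amplitude verification.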
Mesoscale phase-based verification with implications for mesoscale assimilation
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 1:00-1:40,
Ensemble forecasts are an increasingly important source of numerical guidance for weather prediction on a variety of time scales. In order to appropriately interpret ensemble forecasts for weather prediction or use them in downstream applications such as flood forecasting or industrial decision making, it is important to understand their strengths and weaknesses. Accurate ensemble predictions should have high skill (the forecasts are close to the observations) and appropriate spread (the forecasts represent the true uncertainty). In addition, we want probabilistic forecasts derived from the ensemble to be both reliable (give unbiased probability estimates) and have good resolution (can distinguish between observed events and non-events). Measuring these qualities requires a large number of samples.
There are several challenges in verifying ensemble forecasts. The first involves their appropriate use. Should the ensemble mean be used/verified even though it is not a possible weather state? Can individual probability forecasts be verified? It is certainly tempting to do so for high-impact weather events. These are both controversial questions and there is no consensus within the ensemble community. Another challenge concerns the verification of rare or extreme events, for which probabilistic forecasts are extremely useful. The sample sizes for rare events may be too small to get robust verification results, and the observations of rare/extreme events may be inaccurate due to insufficient sampling or instrument error. A third challenge is conveying the skill of ensemble and probabilistic forecasts to users. The standard diagrams and metrics for verifying probability forecasts are more complicated than those used for deterministic forecasts, and explaining them to non-experts is not easy. A recent study in the US noted that many forecasters would like to see more intuitive verification of ensembles, including object-based approaches.
These and other challenges will be discussed in the talk, and illustrated using ensemble forecasts for Australian heavy rain events.
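The skill/spread consistency mentioned above can be checked numerically. In the synthetic example below (all numbers are assumptions), truth and members are drawn from the same distribution, so the mean squared error of the ensemble mean should match the average ensemble variance inflated by (m+1)/m for m members.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 2000                                   # members, forecast cases
center = rng.normal(0.0, 1.0, n)                  # forecast centre per case
truth = center + rng.normal(0.0, 1.0, n)          # truth drawn like a member
ens = center[None, :] + rng.normal(0.0, 1.0, (m, n))
mse = np.mean((ens.mean(axis=0) - truth) ** 2)    # skill of the ensemble mean
spread2 = np.mean(ens.var(axis=0, ddof=1))        # average ensemble variance
ratio = mse / (spread2 * (m + 1) / m)             # ~1 for a reliable ensemble
```

A ratio well above one indicates an under-dispersive ensemble (too little spread for its error), well below one an over-dispersive one.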
Challenges in verifying ensemble forecasts
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 1:40-2:20,
Climate and hydrological forecasts are inherently uncertain. Probabilistic forecasts are needed to express that uncertainty. Verification of probabilistic forecasts is a challenging task as there are many facets to it. There has been considerable experience in the verification of probabilistic forecasts of binary and categorical events, but much less experience in the verification of probabilistic forecasts of continuous variables. With the development of more sophisticated dynamic and statistical models that provide large ensembles to represent the probability distributions of continuous forecast variables, there is a need for systematic verification of probabilistic forecasts of continuous variables. In this talk, we present a suite of methods that we have adopted, adapted and devised for the verification of probabilistic forecasts of continuous variables. Methods for overall verification include the use of forecast skill scores based on the linear error in probability space score (LEPS), the continuous ranked probability score (CRPS) and the root squared error in reference probability (RSERP), and the use of a PIT (probability integral transform) uniform probability plot. Methods for detailed verification include comparisons of forecasts with observed data for individual cases, in terms of both quantiles and PIT values, to assess quality of forecasts both over time and over event size. We will present an application of the verification methods to streamflow forecasts produced using Bayesian joint probability (BJP) modelling.
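Two of the quantities mentioned can be computed directly from ensemble members. The sample-based CRPS estimator below, E|X − y| − ½E|X − X′|, and the empirical-CDF PIT value are standard definitions, offered here as a sketch rather than the authors' implementation.

```python
import numpy as np

def crps_ensemble(members, obs):
    # sample-based CRPS: E|X - y| - 0.5 * E|X - X'|
    m = np.asarray(members, dtype=float)
    term1 = np.abs(m - obs).mean()
    term2 = np.abs(m[:, None] - m[None, :]).mean()
    return term1 - 0.5 * term2

def pit_value(members, obs):
    # probability integral transform: empirical forecast CDF at the observation
    return float(np.mean(np.asarray(members) <= obs))
```

Collecting PIT values over many forecasts and checking them against a uniform distribution yields the PIT uniform probability plot referred to in the abstract.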
Methods for verification of probabilistic forecasts of continuous variables
QJ Wang & David Robertson
CSIRO Land & Water
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 2:20-3:00,
The Met Office MOGREPS ensemble forecasting system became formally operational in September 2008, having already run for several years in trials. The Met Office also makes use of the ECMWF EPS for its medium-range forecasting activities. The applications of these ensembles are wide-ranging; examples include warnings of severe weather events and estimating the uncertainty of a particular forecast. In this talk, I aim to show the range of area-based and site-specific products produced using MOGREPS and ECMWF ensemble data. I will show how they are used by customers to make weather-based decisions. Examples will include an automated warning system of forthcoming severe weather events such as heavy rain or gales. Further examples of multimodel forecasts and tropical cyclone track forecasts will be presented. I will also give some examples of how forecasters use the ensembles in their assessment of the most likely outcomes and risks of high impact weather.
Ensemble forecast applications for forecasters and external customers at the UK Met Office
UK Met Office
Wednesday 18 February, 6th Floor, Conference Room 3 (SW corner), 700 Collins, 6:00-7:00,
Probabilistic forecasts, based on ensemble prediction systems developed applying chaos theory, are now part of daily weather products, and their use by weather forecasters and third parties is increasing. What are they? What is their value? Are they the future of weather forecasting?
Chaos theory describes the behaviour of dynamical systems that are highly sensitive to initial conditions (the butterfly effect). As a result of this sensitivity, the behaviour of chaotic systems appears to be random, even though these systems are deterministic, i.e. their future dynamics can be predicted starting from their initial conditions. The atmosphere is such a system: even very small initial condition errors can amplify very rapidly and reduce the accuracy of single weather forecasts.
Following the pioneering work of Edward Lorenz in the 60s, chaos theory has been applied to numerical weather prediction to develop a new, probabilistic approach to weather prediction designed to generate not only one, single forecast, but an ensemble of forecasts that could be used to estimate the forecast uncertainty. Ensemble prediction systems became part of the operational suites at the major meteorological centres in the early 90s.
Today, 10 centres generate about 450 medium-range, global ensemble forecasts every day, valid up to 15 days ahead. Many more centres run higher-resolution, short-range ensemble prediction systems over limited regions for up to a few days, and a few centres run global, lower-resolution forecasts for up to 15 months. Ensemble-based methods are also used in data assimilation, to estimate more precisely the current state of the atmosphere.
Chaos, ensembles and weather prediction
Thursday 19 February, 9th Floor Seminar Room, 700 Collins, 9:10-9:50,
The Australian Water Availability Project (AWAP) monitors the state and trend of the terrestrial water balance of the Australian continent, using a model called WaterDyn. We use three different methods to estimate the parameters in WaterDyn - a down-gradient method (Levenberg-Marquardt), a global search method (a genetic algorithm) and a sequential method (the ensemble Kalman filter). We compare the parameters estimated by the different methods, and use the Kalman filter to quantify the uncertainty in calculated model variables due to uncertainty in the parameters. Equifinality is an issue, as is horizontal heterogeneity in parameters due to landscape variability. The observations we currently use to estimate parameters are monthly-mean streamflow observations in unimpaired catchments, and with a daily model time-step the assimilation of monthly-mean streamflow measurements in the (sequential) ensemble Kalman filter requires special consideration.
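The "special consideration" for monthly-mean observations can be illustrated with a time-averaging observation operator: one monthly innovation updates all daily states through the ensemble cross-covariances. This sketch uses an unperturbed-observation update for brevity, and the variable names and dimensions are assumptions, not the AWAP configuration.

```python
import numpy as np

def monthly_mean_update(daily_ens, obs_monthly, obs_var):
    """Update daily states (members x days) with one monthly-mean observation."""
    hx = daily_ens.mean(axis=1)                    # H x: monthly mean per member
    A = daily_ens - daily_ens.mean(axis=0)         # state anomalies
    a = hx - hx.mean()                             # observation-space anomalies
    cov = A.T @ a / (len(hx) - 1)                  # cov(daily state, monthly mean)
    gain = cov / (a @ a / (len(hx) - 1) + obs_var) # Kalman gain, one per day
    return daily_ens + np.outer(obs_monthly - hx, gain)
```

The key point is that the observation operator spans the whole month, so the filter must hold (or reconstruct) the daily trajectory rather than only the current state.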
Parameter estimation in the Australian Water Availability Project
Cathy Trudinger, Michael Raupach & Peter Briggs
Thursday 19 February, 9th Floor Seminar Room, 700 Collins, 9:50-10:30,
The ensemble Kalman filter (EnKF) has proven to be a robust variant among existing Kalman filter-based hydrologic data assimilation methods, as it can integrate measurements into a non-linear model without the need to alter the model. In the EnKF, uncertainty of the background prediction is stochastically simulated using a Monte Carlo approach: state variables, model parameters and forcing data are perturbed to explicitly simulate the background prediction error, ideally without affecting the average of the ensemble predictions. However, the interplay between the perturbations and the nonlinear physics of the hydrologic model can result in systematic biases in the background prediction. In this work, it is shown that perturbed soil moisture states in the hydrologic data assimilation system can cause significant biases in the deeper-layer soil moisture and water/energy fluxes at the land surface. To demonstrate the perturbation biases, the Noah land surface model in the NASA Land Information System (LIS) is run over a medium-scale basin located in Oklahoma, USA. Implications of the ensemble perturbation biases are discussed and a simple method to correct the biases is introduced. Results from a series of synthetic twin experiments indicate a remarkable improvement in the prediction of deeper-layer soil moisture, surface water/energy fluxes and runoff, at the expense of slightly degrading the ensemble spread of surface soil moisture near the dry bound.
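The mechanism behind such perturbation biases is essentially Jensen's inequality: a symmetric, zero-mean state perturbation passed through a nonlinear flux yields a biased ensemble-mean flux. The toy drainage function and numbers below are assumptions for illustration, not the Noah model physics.

```python
import numpy as np

def drainage(s, k=1.0, b=4.0):
    # convex toy drainage flux: nonlinear in soil moisture s
    return k * s ** b

rng = np.random.default_rng(42)
s0 = 0.5                                         # unperturbed soil moisture
perturbed = s0 + rng.normal(0.0, 0.1, 10_000)    # zero-mean state perturbations
bias = drainage(perturbed).mean() - drainage(s0)
# bias > 0: the ensemble-mean flux systematically exceeds the unperturbed flux
```

One correction of the kind the abstract alludes to is to recentre the perturbed fluxes on an unperturbed control run, though the authors' actual method may differ.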
Impacts of Ensemble Perturbation Biases in Hydrologic Prediction and Data Assimilation
Dongryeol Ryu, Wade T. Crow & Xiwu Zhan
University of Melbourne
Thursday 19 February, 9th Floor Seminar Room, 700 Collins, 11:00-11:30,
Calibration of conceptualised rainfall-runoff models typically involves determining a single set of parameter values that gives an optimal fit between model estimates and time series of streamflow measurements, an approach known as batch estimation. An alternative is to estimate model parameters sequentially through time as observations become available. The sequential approach has the potential to track temporal variation in parameter values that may result from changes in catchment physical properties, whereas the batch approach assumes static parameter values over the entire observation period. Methods examined in this study exploit Bayesian statistics and Monte Carlo simulation in what are broadly referred to as Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods. This talk identifies some of the challenges encountered when these approaches were used to estimate parameters of simple rainfall-runoff models. Results of synthetic investigations are presented, along with discussion of their application to real streamflow observations.
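A minimal batch-estimation example in the MCMC family is a Metropolis sampler for a single store coefficient k in a toy linear-store model q = k·s. The model, prior, noise level and tuning constants below are all assumptions for illustration, not the study's models.

```python
import numpy as np

def log_post(k, storage, q_obs, sd=0.1):
    # log posterior for q = k * storage with Gaussian errors, uniform(0,1) prior
    if k <= 0.0 or k >= 1.0:
        return -np.inf
    resid = q_obs - k * storage
    return -0.5 * np.sum((resid / sd) ** 2)

def metropolis(storage, q_obs, n=5000, step=0.05, seed=1):
    # random-walk Metropolis sampler for the store coefficient k
    rng = np.random.default_rng(seed)
    k, lp = 0.5, log_post(0.5, storage, q_obs)
    chain = np.empty(n)
    for i in range(n):
        prop = k + rng.normal(0.0, step)
        lp_prop = log_post(prop, storage, q_obs)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject
            k, lp = prop, lp_prop
        chain[i] = k
    return chain
```

A sequential (SMC) variant would instead carry a weighted particle set for k forward through time, reweighting as each new flow observation arrives.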
Batch and sequential approaches to rainfall-runoff model parameter estimation based on Monte Carlo simulation
Thursday 19 February, 9th Floor Seminar Room, 700 Collins, 11:30-12:00,
Changes to weather patterns, with increasing incidence of coastal flooding in recent years, have led to growing concern over the effects of climate change on flooding and highlighted the importance of accurate knowledge of coastal morphology in natural disaster prediction and management. It is essential that we improve our ability to predict floods; being able to better identify and anticipate flood risk would facilitate the development of suitable strategies for the management of coastal areas and help to limit the damage and distress caused by flooding. Key to this is better knowledge and understanding of how the morphology of the coastal zone is evolving over time. Accurate bathymetry immediately prior to a storm event would allow improved flood forecasting using coastal inundation models.
Coastal morphodynamics presents a challenge to modellers. Difficulties in modelling the underlying physical processes, in setting the initial state of the model, and in setting the model parameters mean that morphodynamic models often perform poorly in practice. A complementary approach to improving model performance is to combine model integrations with observations of morphology using data assimilation techniques.
While data assimilation has been used in atmospheric and oceanic prediction for some years, it has rarely been used for coastal morphodynamic modelling, despite the availability of suitable observations from a variety of sources. Here we consider a novel application of the technique and describe how data assimilation can also be used to deliver improved estimates of uncertain morphodynamic model parameters. This is achieved using state augmentation. Earlier applications of state augmentation have typically employed the 4D-Var, Kalman filter or ensemble Kalman filter assimilation schemes. Our new method is based on a computationally inexpensive 3D-Var scheme.
The aim of this paper is to demonstrate parameter estimation using 3D-Var data assimilation for a simple 1D model of bed-form propagation. The long-term objective is to implement a parameter estimation scheme in a full morphodynamic assimilation-forecast system. However, the use of a simple model in the current work allows ideas to be developed, tested and understood without the obfuscating features of a more complex system. Our results show that 3D-Var can be used successfully for parameter estimation. The scheme is capable of recovering near-perfect parameter values and therefore improves our model's capability to predict future bathymetry. Such positive results suggest the potential for application to more complex morphodynamic models.
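The state-augmentation idea can be sketched in one dimension: append an uncertain advection speed c to the state of a toy bed-form model, and let a single least-squares (3D-Var-like) update estimate c from observations of the bed alone. The upwind scheme, the finite-difference sensitivity, and all numbers are assumptions, not the authors' model.

```python
import numpy as np

n, dx, dt, nsteps = 50, 1.0, 0.5, 10
x = np.arange(n) * dx
h0 = np.exp(-0.5 * ((x - 10.0) / 3.0) ** 2)   # initial bed-form

def advect(h, c):
    # first-order upwind advection of the bed-form at speed c (periodic domain)
    for _ in range(nsteps):
        h = h - c * dt / dx * (h - np.roll(h, 1))
    return h

true_c, guess_c = 1.0, 0.8
h_obs = advect(h0, true_c)    # "observed" bathymetry
h_bg = advect(h0, guess_c)    # background forecast with the wrong speed

# Sensitivity of the bed to c via finite differences; in the full scheme the
# augmented block of the background-error covariance plays this role.
eps = 1e-3
sens = (advect(h0, guess_c + eps) - h_bg) / eps
dc = sens @ (h_obs - h_bg) / (sens @ sens)    # least-squares increment for c
c_analysis = guess_c + dc
```

Because observations of the bed alone constrain c through the model's sensitivity, no direct measurement of the parameter is needed, which is the appeal of state augmentation.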
Variational data assimilation for morphodynamic model parameter estimation
Polly Smith, S.L. Dance, M.J. Baines, N.K. Nichols, & T.R. Scott
University of Reading
Thursday 19 February, 9th Floor Seminar Room, 700 Collins, 12:00-12:30,
Throughout the past two decades, the connections between physical processes and ecosystems in the marine environment have been investigated using marine biogeochemical (BGC) models. The current state-of-the-art spatially resolved BGC models are deterministic in nature and are rarely accompanied by quantitative estimates of uncertainty in model parameters or predictions. Contemporary Bayesian hierarchical methods offer a new and potentially powerful approach to addressing uncertainty. Using a physical-statistical model, which combines a stochastic formulation of the BGC model and statistical inference techniques, a more quantitative measure of model parameters and predictions can be made. Initial results from a highly idealised BGC model are encouraging and yield useful insight into some of the problems we may face in a more complex formulation of the physical-statistical model.
Parameter estimation techniques in a simple biogeochemical model using Bayesian methods
Thursday 19 February, 9th Floor Seminar Room, 700 Collins, 2:00-3:00,
There are several issues regarding the use of ensembles that are often glossed over. Some concern the factors that determine analysis error within data assimilation systems. Others concern the relationships between SVs and BGVs. More fundamental ones concern the behavior of atmospheric chaos, including aspects insufficiently examined and understood. A few of these issues will be described and some outstanding questions posed.
Pertinent issues and open questions regarding the use of ensembles for weather analysis and prediction