Numerical models use a subset of the full fluid dynamics and thermodynamics equations describing the atmosphere. Subsets used for tropical cyclone track prediction include barotropic non-divergent models, shallow-water equations, truncated baroclinic models, and full primitive-equation models. The essential elements of these systems are:
- A grid of points at which atmospheric parameters are held (grid-point model), or a series of polynomials or spectral functions that approximate the atmospheric fields (spectral models);
- One or more distinct layers or levels in the vertical;
- An analysis/assimilation cycle to obtain the initial fields;
- A means of bogussing observations to provide additional information in data-void regions;
- A method for integrating the model equations forward in time;
- Some form of physical parameterisation to incorporate the effects of systems smaller than can be resolved by the grid or spectral wave numbers being used; and,
- A means of handling forcing from the model boundaries.
Details of these processes are beyond the scope of this Guide and may be found in standard texts, such as Haltiner and Williams (1979). Here we provide some of the essentials and requirements for such processes.
All operational models, including spectral models, start with atmospheric fields on a regular grid array, on which the required atmospheric parameters are held. Model grids can have any shape; they do not need to be regular or square. Grids that are used include various types of map projections, such as a latitude/longitude array, a Lambert conformal grid, or a polar stereographic projection. The grid may be distorted to provide higher resolution in a specified region or near a system of interest. Alternatively, higher resolution grids may be successively nested within each other to provide a telescope effect with a high resolution focus on the region of interest. The grid also may be regular, with all data held at common grid-points, or it may be staggered, with data split between two overlapping grids.
The vertical levels, or layers, are defined by using some form of vertical coordinate. The sigma-coordinate is a popular choice and takes the form:

    sigma = (p - pt) / (ps - pt)

where ps and pt are the atmospheric pressure at the earth's surface and the top of the model, respectively. This coordinate has the advantage of adjusting the variable surface orography into a "flat" coordinate surface: the earth's surface always lies at sigma = 1 and the model top at sigma = 0. Most operational models use terrain-following coordinates similar to these.
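As a minimal sketch of the definition above (pressures in hPa; the 10 hPa model top is an assumed illustrative value, not any particular model's setting), the conversion between pressure and sigma can be written as:

```python
def sigma(p, ps, pt=10.0):
    """Terrain-following sigma value for pressure p (hPa).

    ps: surface pressure at this grid column (hPa)
    pt: model-top pressure (hPa); 10 hPa is an assumed value
    """
    return (p - pt) / (ps - pt)

def pressure(sig, ps, pt=10.0):
    """Invert sigma back to pressure (hPa)."""
    return pt + sig * (ps - pt)
```

Whatever the surface pressure over orography, sigma(ps, ps) is always 1 at the ground and sigma(pt, ps) is always 0 at the model top, which is precisely the "flat surface" property described above.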
It is normal to vary the vertical resolution considerably, especially to provide high resolution in the atmospheric boundary layer and in the upper-level outflow region, where sharp vertical gradients occur.
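One simple way to concentrate levels near the surface is a stretching function applied to evenly spaced indices. The cubic stretch below is an assumed illustrative choice, not the spacing of any operational model:

```python
def stretched_sigma_levels(n, power=3.0):
    """n sigma levels from model top (sigma=0) to surface (sigma=1),
    clustered near sigma=1.

    power > 1 concentrates levels in the boundary layer; power=3 is an
    assumed illustrative value.
    """
    return [1.0 - ((n - 1 - k) / (n - 1)) ** power for k in range(n)]
```

With 11 levels and power=3, the layer adjacent to the surface is hundreds of times thinner (in sigma) than the layer at the model top, giving the high boundary-layer resolution described above.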
Analysis and assimilation consist of taking all available data and converting them to a form suitable for model integration. The data are obtained from a wide variety of sources and instruments, including direct temperature, moisture and pressure from radiosondes, remotely sensed temperatures from TOVS, winds from satellite cloud drift calculations, etc. These observations must be combined into an initial analysis in which the various atmospheric fields are balanced and internally consistent, to minimise shock to the model during the initial integration. Observations with high consistency and low error characteristics also need to be given priority over concomitant observations of lower quality.
The cycle commences with some form of quality check of incoming observations for obvious errors. Examples of such checks include testing for unrealistic vertical gradients in a radiosonde profile, or for unacceptably large differences from the short-term model forecast valid at the same time or from nearby observations (often called buddy checking).
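The two checks just mentioned can be sketched as follows. The tolerances are assumed placeholder values; operational systems derive them from known observation and forecast error statistics:

```python
def gross_check(ob, background, tol):
    """Accept only if the observation is within tol of the
    short-term model forecast (the 'background') for the same time."""
    return abs(ob - background) <= tol

def buddy_check(ob, neighbours, tol):
    """Accept only if the observation agrees with the mean of nearby
    observations ('buddies') to within tol.

    neighbours: list of nearby observed values of the same quantity.
    """
    if not neighbours:
        return True  # no buddies available -> this test cannot reject
    mean = sum(neighbours) / len(neighbours)
    return abs(ob - mean) <= tol
```

An observation would normally have to pass both tests (plus checks such as the vertical-gradient test) before entering the analysis.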
Analysis of the data onto a model grid can be achieved in a number of ways. A typical cycle is for the model forecast or the previous analysis to be used as a first guess. This field is subtracted from all observations to obtain a set of difference data, which are then weighted according to their known error characteristics (for example a radiosonde observation is weighted higher than a TOVS temperature retrieval). These data are then transformed to the model grid using some form of statistical interpolation (Seaman and Hutchinson, 1985), or surface fitting routine (Ooyama, 1987). The horizontal and vertical influence of each observation is determined from empirical experience and from the number of observations in the region. For example, a single observing station with no surrounding observations may be given a wider influence. Also a strong wind observation may be given an elliptical influence with the long axis aligned in the wind direction.
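The increment-and-weight cycle above can be illustrated with a much-simplified successive-correction (Cressman-style) weighting; this is a deliberate stand-in for the statistical interpolation of Seaman and Hutchinson (1985), and the influence radius and error handling are assumed illustrative choices:

```python
def cressman_weight(r, R):
    """Distance-dependent influence weight: 1 at the observation,
    falling to 0 at radius R."""
    if r >= R:
        return 0.0
    return (R * R - r * r) / (R * R + r * r)

def analyse_point(first_guess, obs, ob_errors, dists, R=500.0):
    """Analysis value at one grid point from observation increments.

    obs, dists: observation values and their distances (km) to the point;
    ob_errors: relative error variances used to down-weight poor data
    (e.g. a TOVS retrieval gets a larger error than a radiosonde).
    This weighting is an illustrative sketch, not an operational scheme.
    """
    num = den = 0.0
    for ob, err, r in zip(obs, ob_errors, dists):
        w = cressman_weight(r, R) / (1.0 + err)
        num += w * (ob - first_guess)   # increment = observation minus first guess
        den += w
    return first_guess if den == 0.0 else first_guess + num / den
```

Note that with no observations inside the influence radius the analysis simply reverts to the first guess, exactly as in the analysis cycle described above.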
The analysis may be either univariate or multivariate. Univariate analyses treat each variable independently. In a multivariate approach, the analysis of one variable is dependent on the analysis of one or more other variables. For example, the wind field may be analysed in a manner which maintains balance with the pressure or height field. Typically, univariate analyses are used in the tropics and at the mesoscale where poor balance exists between pressure and wind fields. Multivariate analyses are used for synoptic-scale mid-latitude flow.
Observations and analyses may be incorporated into the model forecasts directly, in what is called a "cold start" for the model. However, fields are not necessarily in a suitably balanced state for the model. This leads to an initial shock to the model, with degradation of the forecasts whilst the system adjusts. A much better approach is to assimilate the observations or analyses directly into the model.
Assimilation may be achieved in a variety of ways and is a topic of current highly mathematical research. Current models may assimilate data "continuously" or "intermittently". During continuous assimilation observations are introduced in small increments at or near the time they were taken as the model integrates forwards. Intermittent assimilation involves stopping the model at specified times, usually 6 hourly, to bring in new data and analyses, then restarting the integration. Analyses resulting from these procedures normally have imbalances between the mass and wind fields. Such imbalances can lead to generation of high frequency gravity waves in the early stages of the model integration.
Most models employ some form of initialisation to minimise these undesirable waves. The most commonly used method is called nonlinear normal-mode initialisation. Nudging also may be used during intermittent assimilation; this involves starting the model some time before the latest analysis time then integrating forwards whilst nudging it to move towards the analyses. At the same time the convective parameterisation may be specified to provide heating in observed regions of convective activity. The result is an initial model field that is as close as possible to the latest analysis, whilst also being in balance with the model requirements.
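The nudging mentioned above is a Newtonian relaxation: an extra tendency term draws the model state towards the analysis as the integration proceeds. A minimal sketch for a single variable (the 6-hour relaxation time is an assumed value; in a real model this term is added to the full tendency equation):

```python
def nudge_step(x, x_analysis, dt, tau=6 * 3600.0):
    """One forward-Euler step of Newtonian relaxation towards an analysis.

    Implements dx/dt = -(x - x_analysis) / tau, where tau (s) controls
    how strongly the model is nudged; tau = 6 h is an assumed value.
    """
    return x + dt * (x_analysis - x) / tau
```

Repeated over the pre-analysis integration window, this gradually moves the model towards the latest analysis while the model's own dynamics keep the fields balanced.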
Another form of assimilation is to use previous model analyses to derive vertical profiles of temperature from satellite observed radiances. This approach has proven to be successful in reducing the biases and occasional wild errors that can plague such derivations from climatological information. Significant improvements in operational forecasts have been demonstrated (Miller, 1992).
In regions of poor data coverage, it is sometimes useful to include bogus observations derived from human interpretation or empirical relationships to provide an indication of major weather patterns that would otherwise go unobserved. Here we describe two approaches that have been successful for tropical cyclone predictions: moisture and cyclone vortex bogussing.
Moisture Bogussing
In the tropics, temperature fields are relatively uniform by comparison with the sharp frontal gradients of higher latitudes. But moisture fields are highly variable and require accurate representation for adequate prediction of convective clouds. A technique pioneered by the Japan Meteorological Agency is to develop an empirical relationship between the black-body temperatures (including variances) from geostationary satellites and moisture profiles from available radiosondes (Mills and Davidson, 1987). This relationship is then applied to the current satellite image to provide a set of bogus moisture observations for inclusion in the analysis cycle. Such bogus moisture observations are used operationally in Australia and Japan with excellent results.
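The kind of empirical relationship described above can be sketched as a least-squares fit of layer humidity to satellite black-body temperature; a single-predictor linear fit is a deliberate simplification of the Mills and Davidson (1987) style regression, and all sample values below are hypothetical:

```python
def fit_tbb_humidity(tbb_samples, rh_samples):
    """Least-squares line rh = a * tbb + b from matched pairs of
    satellite black-body temperature (K) and radiosonde-layer
    relative humidity (%)."""
    n = len(tbb_samples)
    mx = sum(tbb_samples) / n
    my = sum(rh_samples) / n
    sxx = sum((x - mx) ** 2 for x in tbb_samples)
    sxy = sum((x - mx) * (y - my) for x, y in zip(tbb_samples, rh_samples))
    a = sxy / sxx
    return a, my - a * mx

def bogus_rh(tbb, a, b):
    """Apply the fitted relation to one satellite pixel,
    clipped to the physical range [0, 100] %."""
    return max(0.0, min(100.0, a * tbb + b))
```

Once fitted against collocated radiosondes, the relation is applied pixel-by-pixel to the current satellite image to generate the bogus moisture observations for the analysis cycle.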
Tropical Cyclone Bogussing
Since most tropical cyclones have very few observations in the vicinity, they often go undetected by standard analyses or are analysed very poorly, with centres ill defined and in the wrong location. Such initial errors obviously have a major impact on the forecast of cyclone tracks and many attempts have been made to provide bogus vortices to approximate the cyclone. These attempts are helped by observations and theory (Elsberry, 1987) that the motion of tropical cyclones is not overly sensitive to details of the inner structure. Care must be taken, however, with the cyclone size and outer structure and with careful specification of the near environmental flow.
Most operational centres use some form of tropical cyclone bogus. These all utilise estimated location and intensity of cyclones from satellite imagery, together with current structural knowledge of tropical cyclones and any other available observations to generate mass and wind data representative of the system to be bogussed.
Two basic methodologies are employed for representing the structure of the bogus tropical cyclone: the circulation may be analytically defined or the model may be used to spin up a set of standard cyclones. These two approaches are shown using the Japan Meteorological Agency (JMA), the US NMC, and the US Navy Fleet Weather Center as examples.
Bogussing Procedure at JMA: Bogussing in the JMA Typhoon Model (TYM) follows the sequence of steps described by Iwasaki et al. (1987).
Bogussing Procedure at the US NMC: The operational QLM model at the US NMC also incorporates a detailed specification of the initial vortex (Mathur, 1991). After an initial vortex specification similar to that for the JMA, the internal steering flow is incorporated by adding a dipole similar to the beta gyres of Fiorino and Elsberry (1989 a,b). Mathur (1991) has shown that track forecast errors for westward moving storms may be significantly reduced by addition of this dipole. During late 1992, bogussing also was introduced into the NMC global system.
Bogussing Procedure at the US Navy Fleet Weather Center: A set of bogus cyclones are spun up using the forecast model. To accomplish this, a Rankine vortex without mean flow, a mean tropical sounding and a constant sea surface temperature (301 K) are specified as initial conditions. The model is then integrated (spun-up) to a steady-state solution and the winds, heights, temperature, moisture and sea-level pressure within 1600 km are used for the bogus cyclone. Several cyclones at different latitudes are used.
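The Rankine vortex used to start this spin-up has solid-body rotation inside the radius of maximum winds and a 1/r decay outside it. A sketch of its tangential wind profile (the vmax and rmax values are assumed for illustration, not the Navy's operational settings):

```python
def rankine_wind(r, vmax=40.0, rmax=50.0):
    """Tangential wind (m/s) of a Rankine vortex at radius r (km).

    Solid-body rotation (v proportional to r) inside rmax, and 1/r decay
    outside; vmax = 40 m/s and rmax = 50 km are assumed values.
    """
    if r <= rmax:
        return vmax * r / rmax
    return vmax * rmax / r
```

Initialising the model with this axisymmetric wind field, a mean tropical sounding and a fixed SST, then integrating to a steady state, yields the balanced mass, wind, moisture and pressure fields that are stored as the bogus cyclone.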
Inserting the chosen bogus cyclone into the analysis is a difficult task. At present, two less than optimal methods are used. In the first, the cyclone is smoothly added to the original analyses (e.g., the JMA and US NMC). In the second, the bogus cyclone circulation is used to generate a set of observations that are inserted into the analysis cycle (e.g., the US Navy and Australian NMC).
Both methods have shortcomings. The bogus vortex addition can introduce significant shocks to the system in the early model integration, which can degrade the subsequent model forecast. In the second method, the bogus data (and any conventional data around the storm) can be rejected by the objective analysis scheme. The rejection occurs because of unacceptable differences from the first guess used by the analysis. This can be somewhat alleviated by use of more appropriate structure functions and error limits in the cyclone vicinity (e.g. Puri and Lönnberg, 1990). But problems still remain.
Another major problem in cyclone bogussing is that the poorly analysed and specified vortex in the first-guess analysis must be removed. If not done correctly, this "ghost" vortex can severely degrade the track forecast. In addition available observations in the cyclone vicinity should be used optimally. Recent work (Kurihara et al., 1993) indicates that an initial smoothing of the fields in the cyclone vicinity followed by insertion of the vortex using a non-linear balance equation approach may provide a solution to the ghost vortex problem, whilst inserting a nicely balanced vortex to minimise model shock.
A major limitation also occurs in specifying the highly asymmetric upper-tropospheric outflow layer. Recent work by Wang and Holland (1993) and Holland and Wang (1993) indicates that the best approach may be to specify only the lower-tropospheric cyclonic vortex and to leave the upper troposphere to be defined by the often bountiful cloud-drift winds and by the model's own spin-up.
All numerical models operate on a finite grid, or with a finite number of waves to describe the spectral signature. As a result, they cannot adequately resolve small-scale features, such as cumulus convection, or the interchange of energy between the surface and the ocean. Those "sub-grid-scale" processes that have an important impact on the forecasts are included by parameterising their effects on the larger scale.
Such parameterisations are based on early theoretical work, such as the Monin-Obukhov boundary-layer approximations, and on observational diagnostics such as those for convection originally proposed by W. Gray and quantified by Yanai, Esbenson and Chu (1973). In essence, some resolvable characteristics of the atmosphere are related to the development of sub-grid processes. A separate set of relationships is then invoked to adjust the large-scale fields for the impact of these processes.
We take cumulus convection as an example. This is known to be related to the large-scale convergence in the lower troposphere and the degree of convective instability. One type of scheme, called convective adjustment, "adjusts" the atmosphere back to a defined lapse rate whenever convective instability of a prescribed amount develops. Alternative schemes depend on both a conditionally unstable atmosphere and local convergence. Once such conditions are satisfied, convection is assumed to occur and a parameterisation scheme is invoked. Such schemes can be relatively simple and based on a defined vertical heating profile with magnitude specified by the degree of moisture convergence (e.g. Kuo, 1974). Or they can be very complex and based on equilibrium conditions between a population of cloud types (Arakawa and Schubert, 1974). In all cases, the aim is to provide the model with an indication of changes required as a result of the unresolved convective activity.
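A toy version of the convective adjustment idea, for a single column: wherever the lapse rate between adjacent levels exceeds a critical value, the pair is relaxed back to that lapse rate while conserving its mean temperature. The reference lapse rate and the single-sweep formulation are assumed simplifications, not any operational scheme:

```python
def convective_adjust(temps, dz, gamma_ref=6.5e-3):
    """Relax a column towards a reference lapse rate where it is exceeded.

    temps: level temperatures (K) ordered bottom to top at spacing dz (m);
    gamma_ref (K/m) is an assumed critical lapse rate (6.5 K/km here).
    One upward sweep; each adjusted pair conserves its mean temperature.
    """
    t = list(temps)
    for k in range(len(t) - 1):
        lapse = (t[k] - t[k + 1]) / dz
        if lapse > gamma_ref:                  # layer is too unstable
            mean = 0.5 * (t[k] + t[k + 1])     # conserve the pair mean
            t[k] = mean + 0.5 * gamma_ref * dz
            t[k + 1] = mean - 0.5 * gamma_ref * dz
    return t
```

Stable layers are left untouched; unstable layers are returned exactly to the reference lapse rate, mimicking the "adjustment back to a defined lapse rate" described above.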
A very important consideration for limited area numerical forecasts is the quality of the boundary conditions (see, e.g., Errico and Baumhefner, 1987). This is because atmospheric waves and disturbances generated at the boundary can rapidly propagate throughout the domain and swamp the model forecast cycle. Two types of boundary therefore need to be carefully handled: the real boundaries at the earth's surface and top of the atmosphere; and the pseudo horizontal boundaries required because of capacity limitations with computers used for operational forecasting. At the earth's surface, models often have many layers close together to help provide the best simulation of such surface effects as local orography and SST. For a region several thousand kilometres on a side, the model solution will tend to be swamped by the horizontal boundary conditions after 2-3 days. For this reason, regional forecast models are always nested into global forecast systems.
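A common way to handle the pseudo lateral boundaries is a relaxation zone in the style of Davies (1976): over a few rows of points nearest the boundary, the limited-area solution is blended towards the driving global model. The linear weight ramp and zone width below are assumed illustrative choices:

```python
def boundary_relaxation(field, driver, width=5):
    """Relax a limited-area row of values towards driving-model data.

    field:  interior model values along one grid row;
    driver: the host (global) model values at the same points.
    Weights ramp linearly from 1 at the lateral edge to 0 `width`
    points inward; the linear ramp is an assumed simple choice.
    """
    n = len(field)
    out = list(field)
    for i in range(n):
        d = min(i, n - 1 - i)          # distance from nearest lateral edge
        if d < width:
            w = 1.0 - d / width        # 1 at the edge, approaching 0 inward
            out[i] = w * driver[i] + (1.0 - w) * field[i]
    return out
```

At the edge the nested model takes the global-model value exactly, while the interior is left untouched, which suppresses the spurious boundary-generated waves discussed above.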